Add chat completion method #645
Conversation
One question @coyotte508, do we want to abstract the
The only issue is now all models are served/compatible with
@radames not quite exactly. TGI-served models expose a
Oh and btw, TGI-served models have a
Thanks @Wauplin, yes I miswrote my sentence; I wanted to say that not all models are served with it. I think we should copy this logic to complete the model URL, in case one provides only a model id:

```python
model_url = self._resolve_url(model)
if not model_url.endswith("/chat/completions"):
    model_url += "/v1/chat/completions"
```
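For the JS client in this PR, a minimal TypeScript sketch of the same idea might look like the following (the helper name is illustrative, not the actual code in `makeRequestOptions.ts`):

```ts
// Sketch only: complete a resolved model URL so it points at the
// OpenAI-compatible chat completion route, mirroring the Python snippet above.
function toChatCompletionUrl(resolvedUrl: string): string {
  return resolvedUrl.endsWith("/chat/completions")
    ? resolvedUrl
    : `${resolvedUrl}/v1/chat/completions`;
}
```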
I've added `chatCompletion`.
Hint: feel free to change it if there is a better way to send that to the request call.
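For reference, here is a rough sketch of how the new method might be called; the model id, token placeholder, and exact parameter shape are assumptions based on the OpenAI-style chat completion schema discussed in this thread:

```ts
import { HfInference } from "@huggingface/inference";

const hf = new HfInference("hf_xxx"); // placeholder token

// Non-streaming chat completion; field names assume an OpenAI-style schema.
const out = await hf.chatCompletion({
  model: "mistralai/Mistral-7B-Instruct-v0.2",
  messages: [{ role: "user", content: "Hello, how are you?" }],
  max_tokens: 100,
});

console.log(out.choices[0].message);
```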
Can you update the tests in (also check my README changes)? It should be good afterwards.
@coyotte508 for the external providers, it has to be
Need to see exactly what options are sent to other endpoints and whether they can be removed.
Co-authored-by: Julien Chaumond <[email protected]>
@coyotte508 here is the issue: when we use predefined tasks, such as in huggingface.js/packages/inference/src/lib/makeRequestOptions.ts (lines 114 to 117 in b66fcf3)
Yes, I just want to know what the value of it is. We could also detect mistral/openai in the domain name and remove the options in that case.
Great point. We might not need `otherOptions`. It seems like we can do it via headers, and we're already doing it.
@radames can you run the tests again with that? Hopefully it should be good afterwards.
Thanks @coyotte508, done! I ran the tests only for the new API.
Yay! Is this close to being merged? I want to play with it :)
I think it just needs the final OK from @coyotte508 and then it's ready to be merged and released. I'm also waiting for this to build a couple of things.
Thanks @radames!
Thanks to #645, it is now possible to use `chatCompletionStream` in the Conversational Widget:
![image](https://github.com/huggingface/huggingface.js/assets/11827707/efa05c3d-5a14-4564-9d50-40de25f0a21b)
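As an example, a minimal streaming sketch (parameter names and the chunk shape are assumptions following the same OpenAI-style schema; the model id is only illustrative):

```ts
import { HfInference } from "@huggingface/inference";

const hf = new HfInference("hf_xxx"); // placeholder token

// Stream the assistant reply token by token, as the Conversational Widget does.
for await (const chunk of hf.chatCompletionStream({
  model: "mistralai/Mistral-7B-Instruct-v0.2",
  messages: [{ role: "user", content: "What is the capital of France?" }],
  max_tokens: 64,
})) {
  process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
}
```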
Supersedes #581. Thanks to @Wauplin, I can import the types from `@huggingface/tasks`.
I've followed the pattern for `textGeneration` and `textGenerationStream`.
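As a rough illustration of the type reuse (the exact type names are assumptions, not necessarily what `@huggingface/tasks` exports):

```ts
// Sketch: reuse shared task types instead of redefining them in the inference
// package; the type names below are assumed for illustration only.
import type {
  ChatCompletionInput,
  ChatCompletionOutput,
} from "@huggingface/tasks";

export async function chatCompletionSketch(
  args: ChatCompletionInput
): Promise<ChatCompletionOutput> {
  // The real method would build the request via makeRequestOptions and fetch,
  // following the textGeneration / textGenerationStream pattern.
  throw new Error("sketch only");
}
```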