Can I use an LLM that isn't listed in the settings? I currently have Phi-3 running under vLLM, but I can't find it as a listed model.
Replies: 1 comment
It's possible that the Settings list doesn't show every supported model. Try entering the name into the box manually, then save. Make sure the name follows the required pattern; in this case I'm guessing you need something like 'openai/modelname', at least according to the LiteLLM documentation I linked in your other discussion. Please refer to that documentation, since it is LiteLLM that recognizes the model name and routes the call to it.
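To illustrate the naming pattern, here is a minimal sketch of how a 'provider/model' identifier splits into the route LiteLLM uses and the bare model name that ends up in the OpenAI-compatible request body. The Phi-3 model name and the splitting helper are assumptions for illustration, not confirmed values from the app or from LiteLLM's internals.

```python
import json

def build_chat_payload(model_id: str, prompt: str) -> str:
    """Sketch: split a 'provider/model' id and build an OpenAI-style
    chat payload. Hypothetical helper, not LiteLLM's actual API."""
    # The prefix before the first '/' selects the provider route
    # (here the OpenAI-compatible one, e.g. a local vLLM server);
    # everything after it is passed through as the model name.
    provider, _, model = model_id.partition("/")
    assert provider == "openai", "expected the OpenAI-compatible route"
    return json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })

# Assumed model name for a vLLM-served Phi-3; check what vLLM was
# actually launched with.
payload = build_chat_payload("openai/microsoft/Phi-3-mini-4k-instruct", "Hello")
```

Alongside the model name, an OpenAI-compatible setup typically also needs the server's base URL (vLLM's default endpoint is usually something like http://localhost:8000/v1, but verify against your own deployment).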