ollama backend does not support api keys #106
OK, so I made a bit of progress. I found out that Open WebUI is not really OpenAI-compatible in the way I thought; not sure where I got that impression. I can use the /ollama endpoint to authenticate with a token and pass my query through. Right now it boils down to:
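As a hedged illustration of a request of that shape (the host, model name, and key below are placeholders, and the /ollama/api/generate path assumes Open WebUI's proxy route in front of ollama, not anything copied from the log):

```python
# Illustrative sketch only: hypothetical host, model, and key.
# Open WebUI is assumed to proxy ollama's generate endpoint under /ollama.
import requests

OPENWEBUI_URL = "https://llm.example.com"  # placeholder Open WebUI instance
API_KEY = "sk-placeholder"                 # placeholder key issued by Open WebUI

resp = requests.post(
    f"{OPENWEBUI_URL}/ollama/api/generate",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "codellama:7b",  # placeholder model name
        "prompt": "def fib(n):",
        "stream": False,          # ask for a single JSON response instead of a stream
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json())
```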
But hitting the API from a script seems to return exactly the same response:
(I copied the API endpoint URL and query body from the llm-ls log.) For some reason, when going through the web UI I get the unexpected error. Looking at the source, it appears that the ollama backend does not pass down the API token even when one is provided. This limits the usefulness of Open WebUI, which does a great job of securing the ollama instance and lets me use my self-hosted LLM server while traveling.
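llm-ls itself is written in Rust, so the following is only a minimal Python sketch of what "passing the token down" would mean at the HTTP level: attach the configured API token as an Authorization header to the outgoing request, which is the step the ollama backend reportedly skips. The helper names and the 401/403 expectation are assumptions, not llm-ls code.

```python
# Python sketch of the behaviour described above; it only mirrors the idea of
# forwarding a configured token as a bearer Authorization header.
from typing import Optional

import requests


def build_headers(api_token: Optional[str]) -> dict:
    """Build request headers, attaching the bearer token when one is configured."""
    headers = {"Content-Type": "application/json"}
    if api_token:
        # The step reportedly missing in the ollama backend: forwarding the token.
        headers["Authorization"] = f"Bearer {api_token}"
    return headers


def generate(url: str, payload: dict, api_token: Optional[str] = None) -> requests.Response:
    # Against a token-protected proxy, omitting the header is expected to be
    # rejected (e.g. 401/403) instead of returning a completion.
    return requests.post(url, json=payload, headers=build_headers(api_token), timeout=60)
```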
Hi!
I'm trying to set up llm-ls via the llm.nvim plugin and I'm hitting weird serde errors. I set the following config:
The weird thing is that completion seems to work just fine, but after each completion I get a serde error, per the following log. Any idea what I'm doing wrong? I find it hard to find docs on using Open WebUI as a backend, as well as Open WebUI's docs for their API (I believe it is supposed to be OpenAI-compatible, hence I set it up that way).
Relevant logs (API key stripped):
... If you've got suggestions on a better alternative to Open WebUI, I might consider it. But I want to open the server up to a few friends and don't want the hassle of manually managing API keys for a raw ollama instance (which works fine, by the way).
Any help would be appreciated!
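A generic way to chase a deserialization error like the one described above is to capture the raw JSON the server actually returns and compare it with the OpenAI-style shape the client expects (for example, a top-level "choices" array). Below is a minimal sketch, assuming an OpenAI-compatible chat-completions route; the URL, model, and key are placeholders rather than values from the logs.

```python
# Diagnostic sketch: dump the raw response body so its shape can be compared
# with what the client tries to deserialize. URL, model, and key are placeholders.
import json

import requests

BASE_URL = "https://llm.example.com"  # placeholder server
API_KEY = "sk-placeholder"            # placeholder key

resp = requests.post(
    f"{BASE_URL}/v1/chat/completions",  # assumed OpenAI-compatible route; adjust to your server
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "codellama:7b",  # placeholder model name
        "messages": [{"role": "user", "content": "def fib(n):"}],
        "stream": False,
    },
    timeout=60,
)
print(resp.status_code)
print(json.dumps(resp.json(), indent=2))  # inspect for OpenAI-style keys such as "choices"
```

If the body lacks fields the client expects, a strict deserializer can report an error even when a usable completion was returned, which could explain why completions work while the error still shows up.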