LobeChat is unable to view the locally installed models #3219
Amazon90
started this conversation in General | Discussion
Description:
I can find the models I downloaded in OpenWebUI, but none of my downloaded local models appear in LobeChat. When I ask a question, an error message appears, even though the Ollama connectivity check passes. I can't find a solution to this issue in the project's installation guide.
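Since the connectivity check passes but no models are listed, a quick way to narrow this down (not part of the original report) is to ask Ollama directly which models it serves; LobeChat's local model list ultimately has to come from what Ollama itself reports via its /api/tags endpoint. A minimal Python sketch, assuming Ollama is listening on its default address http://localhost:11434:

```python
# Minimal sketch: ask Ollama directly which models it serves.
# Assumption: Ollama is on its default endpoint http://localhost:11434;
# adjust OLLAMA_URL if it listens elsewhere.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # assumption: default host/port

with urllib.request.urlopen(f"{OLLAMA_URL}/api/tags", timeout=5) as resp:
    payload = json.load(resp)

models = [m["name"] for m in payload.get("models", [])]
print(f"Ollama reports {len(models)} local model(s):")
for name in models:
    print(f"  - {name}")
```

If this list comes back empty, the models likely live in a different Ollama instance (for example one bundled inside an OpenWebUI container) than the one LobeChat is pointed at; if they do show up, the problem is more likely on the LobeChat side (endpoint URL or CORS).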
Reproduction environment:
Ollama 0.2.5 for Windows
OS: Windows 11 Pro 23H2
Added a user environment variable OLLAMA_ORIGINS=* and restarted the computer.
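Because LobeChat typically calls Ollama straight from the browser, OLLAMA_ORIGINS only helps if the running Ollama process actually picked the variable up. The sketch below (an illustration, not from the original report) sends a browser-style Origin header and checks the CORS response; the LobeChat origin used here is an assumption, so substitute the address your LobeChat instance is served from:

```python
# Minimal sketch: check whether Ollama's CORS settings allow a browser origin.
# Assumptions: Ollama at the default http://localhost:11434, and LobeChat
# served from http://localhost:3210 (replace with your actual LobeChat URL).
import urllib.request

OLLAMA_URL = "http://localhost:11434"
LOBECHAT_ORIGIN = "http://localhost:3210"  # assumption: adjust to your setup

req = urllib.request.Request(
    f"{OLLAMA_URL}/api/version",
    headers={"Origin": LOBECHAT_ORIGIN},
)
with urllib.request.urlopen(req, timeout=5) as resp:
    allow = resp.headers.get("Access-Control-Allow-Origin")

if allow in ("*", LOBECHAT_ORIGIN):
    print(f"CORS looks OK: Access-Control-Allow-Origin = {allow}")
else:
    print(f"CORS header missing or restrictive ({allow!r}); "
          "OLLAMA_ORIGINS may not have been picked up by the Ollama process.")
```

A missing or unexpected header would suggest the running Ollama process is not seeing the variable (for example because it was started before the variable was set).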
Replies: 1 comment

This is a really bad project!