Pin litellm version 1.50.0 while 1.50.2 is getting fixed #1494
Comments
@blakkd Is this still an issue on the latest version (0.4.3) when you run …
I realize I never tried the interactive option before!
Here are the logs:
- litellm
- without --local
- with --local
- ollama serve
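For reference, a minimal local reproduction looks roughly like this (a sketch: the model name is illustrative, and it assumes a default Open Interpreter and ollama install):

```shell
# Terminal 1: start the local ollama server
ollama serve

# Terminal 2: pull a model to serve (model name is illustrative)
ollama pull llama3

# Terminal 2: run Open Interpreter in local mode against ollama
interpreter --local
```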
Fixed with the litellm bump to 1.52.0 :)
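For anyone hitting this on an existing install, bumping litellm past the fixed release should resolve it (a sketch; adjust to your environment):

```shell
pip install --upgrade "litellm>=1.52.0"
```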
Describe the bug
Instead of showing code blocks, it shows, for example:
Reproduce
Expected behavior
Screenshots
No response
Open Interpreter version
0.4.0
Python version
3.11.10
Operating System name and version
Ubuntu
Additional context
As seen, downgrading litellm from 1.50.2 to 1.50.0 fixes the issue.
I only see the behavior with an ollama instance, not with groq, for example.
I guess it's on litellm's side, but it might be worth pinning the litellm version for now, since the current state makes it unusable for local ollama users.
That's just a suggestion; I don't know how you usually handle situations like this.
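If pinning is the route taken, a minimal sketch of the workaround from the user side (exact version bounds are a maintainer decision):

```shell
# Downgrade to the known-good release reported above
pip install "litellm==1.50.0"

# Or constrain the resolver when installing Open Interpreter
pip install open-interpreter "litellm!=1.50.2"
```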
Cheers mates
EDIT: Sorry, the copy-paste was from an earlier version of OI; I might have mixed up my windows or something. But the behavior of course remains on 0.4.0.