
[Bug]: The Anthropic provider does not support vision #3792

Closed
amanape opened this issue Sep 9, 2024 · 6 comments
Labels: bug (Something isn't working), frontend (Related to frontend), javascript (Pull requests that update Javascript code), small effort (Estimated small effort)

Comments

amanape (Member) commented Sep 9, 2024

Is there an existing issue for the same bug?

Describe the bug

The current model IDs used are incorrect and do not support vision.

Current OpenHands version

0.9.2

Installation and Configuration

N/A

Model and Agent

Anthropic models

Operating System

osx

Reproduction Steps

  1. Select Anthropic provider
  2. Select a model that is supposed to support vision
  3. Upload images
  4. Try to send

Logs, Errors, Screenshots, and Additional Context

No response

amanape added the bug, frontend, javascript, and small effort labels on Sep 9, 2024
amanape self-assigned this on Sep 9, 2024
amanape (Member, Author) commented Sep 9, 2024

#3773 (comment)

enyst (Collaborator) commented Sep 9, 2024

I'll test for OpenAI in a bit. FWIW, I am not sure the frontend should change this; it feels like the right thing is to standardize with the provider as best we can. Rather, this pattern could be added to liteLLM's list: the model does support vision, liteLLM just doesn't know it has vision when the ID is written with the "anthropic/" prefix, but it does know when it is written without it. Maybe we can work around it on our side first, though, pending their fix.
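
As a rough illustration of the mismatch, here is a minimal sketch using litellm's `supports_vision()` helper; the Claude model IDs below are illustrative, not necessarily the exact IDs the frontend sends:

```python
# Sketch: compare litellm's vision detection for an unprefixed vs. a
# provider-prefixed Anthropic model ID. Assumes litellm is installed;
# the model IDs are illustrative.
import litellm

for model_id in (
    "claude-3-5-sonnet-20240620",            # key as it appears in litellm's model map
    "anthropic/claude-3-5-sonnet-20240620",  # provider-prefixed ID used by the frontend
):
    try:
        has_vision = litellm.supports_vision(model=model_id)
    except Exception:
        # Depending on the litellm version, unknown keys may raise instead of returning False.
        has_vision = False
    print(f"{model_id}: supports_vision={has_vision}")
```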

ColeMurray (Contributor) commented Sep 10, 2024

Looking at the list of prices, this would be broken for OpenAI as well:
https://github.com/BerriAI/litellm/blob/main/model_prices_and_context_window.json

None of the OpenAI models have a provider/ prefix.

I think it would make sense to request the additions within the litellm repo, to standardize and avoid having to maintain a mapping within this repo.

Created an issue here: BerriAI/litellm#5608
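
To see which keys that JSON actually exposes, here is a small sketch that inspects the map litellm loads from it; it assumes the parsed map is available as `litellm.model_cost`, which holds true in recent litellm versions:

```python
# Sketch: inspect the model map litellm loads from model_prices_and_context_window.json
# and report which Claude entries are keyed with or without an "anthropic/" prefix,
# and whether each entry advertises vision support.
import litellm

for key, info in sorted(litellm.model_cost.items()):
    if "claude" not in key:
        continue
    prefixed = key.startswith("anthropic/")
    print(f"{key}: prefixed={prefixed}, supports_vision={info.get('supports_vision', False)}")
```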

ColeMurray (Contributor) commented

I've made a pull request in litellm that will resolve this:
BerriAI/litellm#5688

amanape (Member, Author) commented Sep 14, 2024

@ColeMurray That is awesome! Thanks a lot

tobitege (Collaborator) commented

That's really super, hopefully it'll get the ✅ soon. Thanks @ColeMurray !
