
Dev update gpt 4 vision preview #9

Merged: 2 commits merged into main from dev-update-gpt-4-vision-preview on Feb 13, 2024
Conversation

toshiakit (Collaborator)
Added the embedding example and removed the MaxNumTokens restriction for gpt-4-vision-preview in openAIChat and callOpenAIChat.

@debymf debymf merged commit ab154ef into main Feb 13, 2024
1 check passed
@debymf debymf deleted the dev-update-gpt-4-vision-preview branch February 13, 2024 17:12
ccreutzi added a commit that referenced this pull request Sep 13, 2024
Adding support for [the new o1-preview and o1-mini models](https://openai.com/index/introducing-openai-o1-preview/).

Note that these two models no longer support the parameter name we translate our `MaxNumTokens` into (`max_tokens`). Fortunately, the new name (`max_completion_tokens`) is also accepted by the older models.

```
================================================================================
Error occurred in topenAIChat/canUseModel(ModelName=o1-preview) and it did not run to completion.
    ---------
    Error ID:
    ---------
    'llms:apiReturnedError'
    --------------
    Error Details:
    --------------
    Error using openAIChat/generate (line 259)
    Server returned error indicating: "Unsupported parameter: 'max_tokens' is not supported with this model. Use
    'max_completion_tokens' instead."

    Error in topenAIChat/canUseModel (line 100)
                testCase.verifyClass(generate(openAIChat(ModelName=ModelName),"hi",MaxNumTokens=1),"string");
================================================================================
```
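The workaround described in the commit message, always sending the newer parameter name since older models accept it too, can be sketched as follows. This is a minimal illustration in Python with a hypothetical `build_request_body` helper, not the repository's actual MATLAB code:

```python
# Hypothetical sketch of the parameter rename the commit works around:
# o1-preview and o1-mini reject "max_tokens" and expect
# "max_completion_tokens", which older models also accept.
def build_request_body(model_name: str, prompt: str, max_num_tokens: int) -> dict:
    """Translate a MaxNumTokens-style limit into the accepted API field."""
    return {
        "model": model_name,
        "messages": [{"role": "user", "content": prompt}],
        # Use the newer field name unconditionally, since the older
        # models accept it as well (per the commit message above).
        "max_completion_tokens": max_num_tokens,
    }

body = build_request_body("o1-preview", "hi", 1)
```

Sending `max_completion_tokens` unconditionally avoids branching on the model name, at the cost of relying on the older models continuing to accept the new field.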