Merge pull request #9 from matlab-deep-learning/dev-update-gpt-4-vision-preview

Dev update gpt 4 vision preview
debymf authored Feb 13, 2024
2 parents a038630 + 845c874 commit ab154ef
Showing 5 changed files with 4 additions and 8 deletions.
2 changes: 1 addition & 1 deletion +llms/+internal/callOpenAIChatAPI.m
@@ -131,7 +131,7 @@

nvpOptions = keys(dict);
if strcmp(nvp.ModelName,'gpt-4-vision-preview')
-nvpOptions(ismember(nvpOptions,["MaxNumTokens","StopSequences"])) = [];
+nvpOptions(ismember(nvpOptions,"StopSequences")) = [];
end

for opt = nvpOptions.'
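For context, here is a minimal standalone sketch of the option filtering after this change. The dictionary contents below are illustrative placeholders, not the repository's actual option map:

    % Illustrative option map; the real callOpenAIChatAPI builds its own dict.
    dict = dictionary(["MaxNumTokens","StopSequences","Temperature"], ...
                      ["max_tokens","stop","temperature"]);
    modelName = "gpt-4-vision-preview";

    nvpOptions = keys(dict);
    if strcmp(modelName,'gpt-4-vision-preview')
        % After this commit only StopSequences is dropped for the vision
        % model; MaxNumTokens is now forwarded to the API as max_tokens.
        nvpOptions(ismember(nvpOptions,"StopSequences")) = [];
    end
    disp(nvpOptions)   % MaxNumTokens and Temperature remain; StopSequences removed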
1 change: 1 addition & 0 deletions .gitignore
@@ -1,3 +1,4 @@
*.env
*.asv
*.mat
startup.m
Binary file added examples/ExampleEmbeddings.mlx
Binary file modified examples/ExampleGPT4Vision.mlx
9 changes: 2 additions & 7 deletions openAIChat.m
@@ -209,8 +209,8 @@
% reproducible responses
%
% Currently, GPT-4 Turbo with vision does not support the message.name
-% parameter, functions/tools, response_format parameter, stop
-% sequences, and max_tokens
+% parameter, functions/tools, response_format parameter, and stop
+% sequences. It also has a low MaxNumTokens default, which can be overridden.

arguments
this (1,1) openAIChat
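As the updated comment documents, MaxNumTokens now overrides the vision model's low default instead of erroring. A hedged usage sketch follows; it assumes a configured OpenAI API key, the image file and prompts are placeholders, and openAIMessages, addUserMessageWithImages, and generate are used as in the repository's public API at this commit:

    % Sketch only: override the low default max_tokens of gpt-4-vision-preview.
    chat = openAIChat("You are an image analysis assistant.", ...
        ModelName="gpt-4-vision-preview");
    messages = openAIMessages;
    messages = addUserMessageWithImages(messages, ...
        "Describe this picture.", "peppers.png");   % placeholder image file
    [txt, response] = generate(chat, messages, MaxNumTokens=512);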
@@ -221,11 +221,6 @@
nvp.Seed {mustBeIntegerOrEmpty(nvp.Seed)} = []
end

-if nvp.MaxNumTokens ~= Inf && strcmp(this.ModelName,'gpt-4-vision-preview')
-    error("llms:invalidOptionForModel", ...
-        llms.utils.errorMessageCatalog.getMessage("llms:invalidOptionForModel", "MaxNumTokens", this.ModelName));
-end

toolChoice = convertToolChoice(this, nvp.ToolChoice);
if ~isempty(nvp.ToolChoice) && strcmp(this.ModelName,'gpt-4-vision-preview')
error("llms:invalidOptionForModel", ...
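Note that this hunk removes only the MaxNumTokens guard; the ToolChoice check that follows it still rejects the vision model. A simplified standalone sketch of the guard that remains (the error message text is illustrative; only the identifier comes from the diff):

    % Simplified sketch of the guard kept in generate: a non-empty ToolChoice
    % still errors for gpt-4-vision-preview, while MaxNumTokens no longer does.
    modelName  = "gpt-4-vision-preview";
    toolChoice = "auto";                 % any non-empty value trips the check
    if ~isempty(toolChoice) && strcmp(modelName,'gpt-4-vision-preview')
        error("llms:invalidOptionForModel", ...
            "ToolChoice is not supported for model %s.", modelName);
    end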
