Fix capitalization: APIKey, per MathWorks naming standards.
This is *not* a breaking change, because we have case-insensitive (and partial)
matching for these names.
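The non-breaking claim rests on the matching behavior the message describes: name-value argument names are matched case-insensitively (and by unique partial prefix), so callers written against the old spelling still resolve to the renamed argument. A minimal sketch, assuming a valid key in the `OPENAI_API_KEY` environment variable; the prompt text is a placeholder:

```matlab
% All of these resolve to the same argument after the rename:
key = getenv("OPENAI_API_KEY");
chat1 = openAIChat("You are a helpful assistant", APIKey=key);  % new canonical spelling
chat2 = openAIChat("You are a helpful assistant", ApiKey=key);  % pre-rename spelling still matches
chat3 = openAIChat("You are a helpful assistant", apikey=key);  % case-insensitive match
```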
ccreutzi committed Jun 6, 2024
1 parent 8d50886 commit 1a5ebb9
Showing 15 changed files with 145 additions and 145 deletions.
6 changes: 3 additions & 3 deletions +llms/+internal/callAzureChatAPI.m
@@ -35,7 +35,7 @@
% apiKey = "your-api-key-here"
%
% % Send a request
-% [text, message] = llms.internal.callAzureChatAPI(messages, functions, ApiKey=apiKey)
+% [text, message] = llms.internal.callAzureChatAPI(messages, functions, APIKey=apiKey)

% Copyright 2023-2024 The MathWorks, Inc.

@@ -55,7 +55,7 @@
nvp.FrequencyPenalty
nvp.ResponseFormat
nvp.Seed
-nvp.ApiKey
+nvp.APIKey
nvp.TimeOut
nvp.StreamFun
end
@@ -64,7 +64,7 @@

parameters = buildParametersCall(messages, functions, nvp);

-[response, streamedText] = llms.internal.sendRequest(parameters,nvp.ApiKey, URL, nvp.TimeOut, nvp.StreamFun);
+[response, streamedText] = llms.internal.sendRequest(parameters,nvp.APIKey, URL, nvp.TimeOut, nvp.StreamFun);

% If call errors, "choices" will not be part of response.Body.Data, instead
% we get response.Body.Data.error
6 changes: 3 additions & 3 deletions +llms/+internal/callOpenAIChatAPI.m
@@ -35,7 +35,7 @@
% apiKey = "your-api-key-here"
%
% % Send a request
-% [text, message] = llms.internal.callOpenAIChatAPI(messages, functions, ApiKey=apiKey)
+% [text, message] = llms.internal.callOpenAIChatAPI(messages, functions, APIKey=apiKey)

% Copyright 2023-2024 The MathWorks, Inc.

@@ -53,7 +53,7 @@
nvp.FrequencyPenalty
nvp.ResponseFormat
nvp.Seed
-nvp.ApiKey
+nvp.APIKey
nvp.TimeOut
nvp.StreamFun
end
@@ -62,7 +62,7 @@

parameters = buildParametersCall(messages, functions, nvp);

-[response, streamedText] = llms.internal.sendRequest(parameters,nvp.ApiKey, END_POINT, nvp.TimeOut, nvp.StreamFun);
+[response, streamedText] = llms.internal.sendRequest(parameters,nvp.APIKey, END_POINT, nvp.TimeOut, nvp.StreamFun);

% If call errors, "choices" will not be part of response.Body.Data, instead
% we get response.Body.Data.error
6 changes: 3 additions & 3 deletions +llms/+internal/getApiKeyFromNvpOrEnv.m
@@ -4,15 +4,15 @@
%getApiKeyFromNvpOrEnv Retrieves an API key from a Name-Value Pair struct or environment variable.
%
% This function takes a struct nvp containing name-value pairs and checks if
-% it contains a field called "ApiKey". If the field is not found, the
+% it contains a field called "APIKey". If the field is not found, the
% function attempts to retrieve the API key from an environment variable
% whose name is given as the second argument. If both methods fail, the
% function throws an error.

% Copyright 2023-2024 The MathWorks, Inc.

-if isfield(nvp, "ApiKey")
-    key = nvp.ApiKey;
+if isfield(nvp, "APIKey")
+    key = nvp.APIKey;
else
if isenv(envVarName)
key = getenv(envVarName);
2 changes: 1 addition & 1 deletion +llms/+internal/textGenerator.m
@@ -31,7 +31,7 @@
properties (Access=protected)
Tools
FunctionsStruct
-ApiKey
+APIKey
StreamFun
end
end
2 changes: 1 addition & 1 deletion +llms/+utils/errorMessageCatalog.m
@@ -43,7 +43,7 @@
catalog("llms:assistantMustHaveTextNameAndArguments") = "Fields 'name' and 'arguments' must be text with one or more characters.";
catalog("llms:mustBeValidIndex") = "Value is larger than the number of elements in Messages ({1}).";
catalog("llms:stopSequencesMustHaveMax4Elements") = "Number of elements must not be larger than 4.";
-catalog("llms:keyMustBeSpecified") = "Unable to find API key. Either set environment variable {1} or specify name-value argument ""ApiKey"".";
+catalog("llms:keyMustBeSpecified") = "Unable to find API key. Either set environment variable {1} or specify name-value argument ""APIKey"".";
catalog("llms:mustHaveMessages") = "Value must contain at least one message in Messages.";
catalog("llms:mustSetFunctionsForCall") = "When no functions are defined, ToolChoice must not be specified.";
catalog("llms:mustBeMessagesOrTxt") = "Messages must be text with one or more characters or an openAIMessages objects.";
2 changes: 1 addition & 1 deletion README.md
@@ -340,7 +340,7 @@ messages = addUserMessageWithImages(messages,"What is in the image?",image_path)

## Establishing a connection to Chat Completions API using Azure

-If you would like to connect MATLAB to Chat Completions API via Azure instead of directly with OpenAI, you will have to create an `azureChat` object. See [the Azure documentation](https://learn.microsoft.com/en-us/azure/ai-services/openai/chatgpt-quickstart) for details on the setup required and where to find your key, endpoint, and deployment name. As explained above, the key should be in the environment variable `AZURE_OPENAI_API_KEY`, or provided as `ApiKey=…` in the `azureChat` call below.
+If you would like to connect MATLAB to Chat Completions API via Azure instead of directly with OpenAI, you will have to create an `azureChat` object. See [the Azure documentation](https://learn.microsoft.com/en-us/azure/ai-services/openai/chatgpt-quickstart) for details on the setup required and where to find your key, endpoint, and deployment name. As explained above, the key should be in the environment variable `AZURE_OPENAI_API_KEY`, or provided as `APIKey=…` in the `azureChat` call below.

In order to create the chat assistant, you must specify your Azure OpenAI Resource and the LLM you want to use:
```matlab
8 changes: 4 additions & 4 deletions azureChat.m
@@ -29,7 +29,7 @@
% ResponseFormat - The format of response the model returns.
% "text" (default) | "json"
%
-% ApiKey - The API key for accessing the OpenAI Chat API.
+% APIKey - The API key for accessing the OpenAI Chat API.
%
% PresencePenalty - Penalty value for using a token in the response
% that has already been used. Default value is 0.
@@ -91,7 +91,7 @@
nvp.TopProbabilityMass {llms.utils.mustBeValidTopP} = 1
nvp.StopSequences {llms.utils.mustBeValidStop} = {}
nvp.ResponseFormat (1,1) string {mustBeMember(nvp.ResponseFormat,["text","json"])} = "text"
-nvp.ApiKey {mustBeNonzeroLengthTextScalar}
+nvp.APIKey {mustBeNonzeroLengthTextScalar}
nvp.PresencePenalty {llms.utils.mustBeValidPenalty} = 0
nvp.FrequencyPenalty {llms.utils.mustBeValidPenalty} = 0
nvp.TimeOut (1,1) {mustBeReal,mustBePositive} = 10
@@ -129,7 +129,7 @@
this.StopSequences = nvp.StopSequences;
this.PresencePenalty = nvp.PresencePenalty;
this.FrequencyPenalty = nvp.FrequencyPenalty;
-this.ApiKey = llms.internal.getApiKeyFromNvpOrEnv(nvp,"AZURE_OPENAI_API_KEY");
+this.APIKey = llms.internal.getApiKeyFromNvpOrEnv(nvp,"AZURE_OPENAI_API_KEY");
this.TimeOut = nvp.TimeOut;
end

@@ -185,7 +185,7 @@
StopSequences=this.StopSequences, MaxNumTokens=nvp.MaxNumTokens, ...
PresencePenalty=this.PresencePenalty, FrequencyPenalty=this.FrequencyPenalty, ...
ResponseFormat=this.ResponseFormat,Seed=nvp.Seed, ...
-ApiKey=this.ApiKey,TimeOut=this.TimeOut, StreamFun=this.StreamFun);
+APIKey=this.APIKey,TimeOut=this.TimeOut, StreamFun=this.StreamFun);
end
end

4 changes: 2 additions & 2 deletions extractOpenAIEmbeddings.m
@@ -9,7 +9,7 @@
%
% 'ModelName' - The ID of the model to use.
%
-% 'ApiKey' - OpenAI API token. It can also be specified by
+% 'APIKey' - OpenAI API token. It can also be specified by
% setting the environment variable OPENAI_API_KEY
%
% 'TimeOut' - Connection Timeout in seconds (default: 10 secs)
@@ -28,7 +28,7 @@
"text-embedding-3-large", "text-embedding-3-small"])} = "text-embedding-ada-002"
nvp.TimeOut (1,1) {mustBeReal,mustBePositive} = 10
nvp.Dimensions (1,1) {mustBeInteger,mustBePositive}
-nvp.ApiKey {llms.utils.mustBeNonzeroLengthTextScalar}
+nvp.APIKey {llms.utils.mustBeNonzeroLengthTextScalar}
end

END_POINT = "https://api.openai.com/v1/embeddings";
4 changes: 2 additions & 2 deletions functionSignatures.json
@@ -11,7 +11,7 @@
{"name":"TopProbabilityMass","kind":"namevalue","type":["numeric","scalar",">=0","<=1"]},
{"name":"StopSequences","kind":"namevalue","type":["string","vector"]},
{"name":"ResponseFormat","kind":"namevalue","type":"choices={'text','json'}"},
-{"name":"ApiKey","kind":"namevalue","type":["string","scalar"]},
+{"name":"APIKey","kind":"namevalue","type":["string","scalar"]},
{"name":"PresencePenalty","kind":"namevalue","type":["numeric","scalar","<=2",">=-2"]},
{"name":"FrequencyPenalty","kind":"namevalue","type":["numeric","scalar","<=2",">=-2"]},
{"name":"TimeOut","kind":"namevalue","type":["numeric","scalar","real","positive"]},
@@ -53,7 +53,7 @@
{"name":"TopProbabilityMass","kind":"namevalue","type":["numeric","scalar",">=0","<=1"]},
{"name":"StopSequences","kind":"namevalue","type":["string","vector"]},
{"name":"ResponseFormat","kind":"namevalue","type":"choices={'text','json'}"},
-{"name":"ApiKey","kind":"namevalue","type":["string","scalar"]},
+{"name":"APIKey","kind":"namevalue","type":["string","scalar"]},
{"name":"PresencePenalty","kind":"namevalue","type":["numeric","scalar","<=2",">=-2"]},
{"name":"FrequencyPenalty","kind":"namevalue","type":["numeric","scalar","<=2",">=-2"]},
{"name":"TimeOut","kind":"namevalue","type":["numeric","scalar","real","positive"]},
30 changes: 15 additions & 15 deletions openAIChat.m
@@ -4,7 +4,7 @@
% CHAT = openAIChat(systemPrompt) creates an openAIChat object with the
% specified system prompt.
%
-% CHAT = openAIChat(systemPrompt,ApiKey=key) uses the specified API key
+% CHAT = openAIChat(systemPrompt,APIKey=key) uses the specified API key
%
% CHAT = openAIChat(systemPrompt,Name=Value) specifies additional options
% using one or more name-value arguments:
@@ -68,14 +68,14 @@

% Copyright 2023-2024 The MathWorks, Inc.

-properties(SetAccess=private)
+properties(SetAccess=private)
%MODELNAME Model name.
ModelName
end


methods
-function this = openAIChat(systemPrompt, nvp)
+function this = openAIChat(systemPrompt, nvp)
arguments
systemPrompt {llms.utils.mustBeTextOrEmpty} = []
nvp.Tools (1,:) {mustBeA(nvp.Tools, "openAIFunction")} = openAIFunction.empty
@@ -84,7 +84,7 @@
nvp.TopProbabilityMass {llms.utils.mustBeValidTopP} = 1
nvp.StopSequences {llms.utils.mustBeValidStop} = {}
nvp.ResponseFormat (1,1) string {mustBeMember(nvp.ResponseFormat,["text","json"])} = "text"
-nvp.ApiKey {mustBeNonzeroLengthTextScalar}
+nvp.APIKey {mustBeNonzeroLengthTextScalar}
nvp.PresencePenalty {llms.utils.mustBeValidPenalty} = 0
nvp.FrequencyPenalty {llms.utils.mustBeValidPenalty} = 0
nvp.TimeOut (1,1) {mustBeReal,mustBePositive} = 10
@@ -105,7 +105,7 @@
this.Tools = nvp.Tools;
[this.FunctionsStruct, this.FunctionNames] = functionAsStruct(nvp.Tools);
end

if ~isempty(systemPrompt)
systemPrompt = string(systemPrompt);
if systemPrompt ~= ""
@@ -124,7 +124,7 @@

this.PresencePenalty = nvp.PresencePenalty;
this.FrequencyPenalty = nvp.FrequencyPenalty;
-this.ApiKey = llms.internal.getApiKeyFromNvpOrEnv(nvp,"OPENAI_API_KEY");
+this.APIKey = llms.internal.getApiKeyFromNvpOrEnv(nvp,"OPENAI_API_KEY");
this.TimeOut = nvp.TimeOut;
end

@@ -143,13 +143,13 @@
% MaxNumTokens - Maximum number of tokens in the generated response.
% Default value is inf.
%
-% ToolChoice - Function to execute. 'none', 'auto',
+% ToolChoice - Function to execute. 'none', 'auto',
% or specify the function to call.
%
% Seed - An integer value to use to obtain
% reproducible responses
-%
-% Currently, GPT-4 Turbo with vision does not support the message.name
+%
+% Currently, GPT-4 Turbo with vision does not support the message.name
% parameter, functions/tools, response_format parameter, and stop
% sequences. It also has a low MaxNumTokens default, which can be overridden.

@@ -165,7 +165,7 @@
toolChoice = convertToolChoice(this, nvp.ToolChoice);

if isstring(messages) && isscalar(messages)
-messagesStruct = {struct("role", "user", "content", messages)};
+messagesStruct = {struct("role", "user", "content", messages)};
else
messagesStruct = messages.Messages;
end
@@ -175,14 +175,14 @@
if ~isempty(this.SystemPrompt)
messagesStruct = horzcat(this.SystemPrompt, messagesStruct);
end

[text, message, response] = llms.internal.callOpenAIChatAPI(messagesStruct, this.FunctionsStruct,...
ModelName=this.ModelName, ToolChoice=toolChoice, Temperature=this.Temperature, ...
TopProbabilityMass=this.TopProbabilityMass, NumCompletions=nvp.NumCompletions,...
StopSequences=this.StopSequences, MaxNumTokens=nvp.MaxNumTokens, ...
PresencePenalty=this.PresencePenalty, FrequencyPenalty=this.FrequencyPenalty, ...
ResponseFormat=this.ResponseFormat,Seed=nvp.Seed, ...
-ApiKey=this.ApiKey,TimeOut=this.TimeOut, StreamFun=this.StreamFun);
+APIKey=this.APIKey,TimeOut=this.TimeOut, StreamFun=this.StreamFun);

if isfield(response.Body.Data,"error")
err = response.Body.Data.error.message;
Expand All @@ -208,7 +208,7 @@ function mustBeValidFunctionCall(this, functionCall)
% if toolChoice is empty
if isempty(toolChoice)
% if Tools is not empty, the default is 'auto'.
-if ~isempty(this.Tools)
+if ~isempty(this.Tools)
toolChoice = "auto";
end
elseif ~ismember(toolChoice,["auto","none"])
@@ -240,11 +240,11 @@ function mustBeNonzeroLengthTextScalar(content)

function mustBeValidMsgs(value)
if isa(value, "openAIMessages")
-if numel(value.Messages) == 0
+if numel(value.Messages) == 0
error("llms:mustHaveMessages", llms.utils.errorMessageCatalog.getMessage("llms:mustHaveMessages"));
end
else
-try
+try
llms.utils.mustBeNonzeroLengthTextScalar(value);
catch ME
error("llms:mustBeMessagesOrTxt", llms.utils.errorMessageCatalog.getMessage("llms:mustBeMessagesOrTxt"));