I use vLLM as the server and the `openai` package as the client, following code similar to the cookbook example at https://github.com/openai/openai-cookbook/blob/main/examples/How_to_call_functions_with_chat_models.ipynb, but the response contains no function-call information. How can I fix this?
---------------------------------------------------- client code info ---------------------------------------------------------
```python
from openai import OpenAI
import json

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. San Francisco, CA",
                    },
                    "format": {
                        "type": "string",
                        "enum": ["celsius", "fahrenheit"],
                        "description": "The temperature unit to use. Infer this from the user's location.",
                    },
                },
                "required": ["location", "format"],
            },
        },
    },
    {
        "type": "function",
        "function": {
            "name": "get_n_day_weather_forecast",
            "description": "Get an N-day weather forecast",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. San Francisco, CA",
                    },
                    "format": {
                        "type": "string",
                        "enum": ["celsius", "fahrenheit"],
                        "description": "The temperature unit to use. Infer this from the user's location.",
                    },
                    "num_days": {
                        "type": "integer",
                        "description": "The number of days to forecast",
                    },
                },
                "required": ["location", "format", "num_days"],
            },
        },
    },
]

openai_api_key = "xxx"
openai_api_base = "http://localhost:8000/v1/"
client = OpenAI(
    api_key=openai_api_key,
    base_url=openai_api_base,
)

models = client.models.list()
model = models.data[0].id

messages = []
messages.append({"role": "system", "content": "Don't make assumptions about what values to plug into functions. Ask for clarification if a user request is ambiguous."})
messages.append({"role": "user", "content": "What's the weather like today"})

response = client.chat.completions.create(
    model=model,
    messages=messages,
    tools=tools,
)
assistant_message = response.choices[0].message
messages.append(assistant_message)
print("response: ", assistant_message)
```
---------------------------------------------- server script-------------------------------------------------
```shell
python entrypoints/openai/api_server.py --model="xxxx/Qwen2-1.5B-Instruct" --trust-remote-code --host "localhost" --port 8000 --dtype auto
```
--------------------------------------------- print info --------------------------------------------------------
```
response:  ChatCompletionMessage(content='Get out and check.', role='assistant', function_call=None, tool_calls=[])
```
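For reference, here is what the client-side handling looks like when a tool-calling-capable model does populate `tool_calls`, following the cookbook's flow. This is a minimal sketch: `get_current_weather` is a hypothetical stub, and the `SimpleNamespace` objects stand in for the real OpenAI response types so the dispatch logic can be exercised without a server.

```python
import json
from types import SimpleNamespace

def get_current_weather(location, format):
    # Hypothetical stub: a real implementation would call a weather API.
    return json.dumps({"location": location, "temperature": 22, "unit": format})

AVAILABLE_TOOLS = {"get_current_weather": get_current_weather}

def run_tool_calls(assistant_message):
    """Execute each tool call and build the follow-up 'tool' role messages."""
    tool_messages = []
    for call in assistant_message.tool_calls or []:
        fn = AVAILABLE_TOOLS[call.function.name]
        # The arguments arrive as a JSON string and must be decoded.
        args = json.loads(call.function.arguments)
        tool_messages.append({
            "role": "tool",
            "tool_call_id": call.id,
            "content": fn(**args),
        })
    return tool_messages

# Simulated assistant message with a populated tool_calls list, mirroring the
# shape of the ChatCompletionMessage printed above.
fake_call = SimpleNamespace(
    id="call_0",
    function=SimpleNamespace(
        name="get_current_weather",
        arguments='{"location": "San Francisco, CA", "format": "celsius"}',
    ),
)
fake_message = SimpleNamespace(tool_calls=[fake_call])
print(run_tool_calls(fake_message))
```

The resulting `tool` messages would be appended to `messages` and sent back in a second `chat.completions.create` call, so the model can compose its final answer from the tool output.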
@FanZhang91 Please format your code with triple backticks.
What GPT model are you using? Your invocation includes `--model="xxxx/Qwen2-1.5B-Instruct"`, which of course isn't an OpenAI model at all.
Please provide a simpler code sample that doesn't use an API server.
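In that spirit, the tool schema itself can be sanity-checked offline with no server involved. This is a minimal sketch using a trimmed copy of the first tool from the issue; `sanity_check` is a hypothetical helper, not part of the `openai` package.

```python
import json

# Trimmed copy of the first tool declared in the issue.
tool = {
    "type": "function",
    "function": {
        "name": "get_current_weather",
        "description": "Get the current weather",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {"type": "string"},
                "format": {"type": "string", "enum": ["celsius", "fahrenheit"]},
            },
            "required": ["location", "format"],
        },
    },
}

def sanity_check(tool):
    # Round-trip through JSON (as the client would serialize it) and verify
    # every required parameter is actually declared under "properties".
    spec = json.loads(json.dumps(tool))
    params = spec["function"]["parameters"]
    return [k for k in params["required"] if k not in params["properties"]]

print(sanity_check(tool))  # → [] (no undeclared required parameters)
```

A check like this rules out a malformed schema as the cause, leaving the server-side tool-calling support as the thing to investigate.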
Environment: Ubuntu 20.04, transformers 4.42.4, openai 1.30.5, vllm 0.5.2