[Bug]: chatting with assistant broke #3874

Open
2 tasks done
gaord opened this issue Sep 15, 2024 · 5 comments
Labels
bug Something isn't working

Comments


gaord commented Sep 15, 2024

Is there an existing issue for the same bug?

Describe the bug

When chatting with the assistant, I always get the following error:

Agent encountered an error while processing the last action.
Error: APIError: litellm.APIError: APIError: OpenAIException - 'str' object has no attribute 'model_dump'
Please try again.

Current OpenHands version

0.9

Installation and Configuration

As in the quick start guide:
export WORKSPACE_BASE=$(pwd)/workspace

docker run -it --pull=always \
    -e SANDBOX_RUNTIME_CONTAINER_IMAGE=ghcr.io/all-hands-ai/runtime:0.9-nikolaik \
    -e SANDBOX_USER_ID=$(id -u) \
    -e WORKSPACE_MOUNT_PATH=$WORKSPACE_BASE \
    -v $WORKSPACE_BASE:/opt/workspace_base \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -p 3000:3000 \
    --add-host host.docker.internal:host-gateway \
    --name openhands-app-$(date +%Y%m%d%H%M%S) \
    ghcr.io/all-hands-ai/openhands:0.9

Model and Agent

gpt-4 via a proxy, CodeActAgent

Operating System

No response

Reproduction Steps

No response

Logs, Errors, Screenshots, and Additional Context

No response

gaord added the bug (Something isn't working) label on Sep 15, 2024
gaord (Author) commented Sep 15, 2024

This can be worked around by setting the Base URL to https://yourhost/v1. With an OpenAI proxy, the /v1 path is required in the base URL.
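
For context: an OpenAI-compatible proxy serves the API under the /v1 path, so a quick way to confirm the proxy is reachable at the right prefix is a direct request (https://yourhost and $YOUR_API_KEY are placeholders here):

curl https://yourhost/v1/chat/completions \
    -H "Content-Type: application/json" \
    -H "Authorization: Bearer $YOUR_API_KEY" \
    -d '{"model": "gpt-4", "messages": [{"role": "user", "content": "ping"}]}'

If the proxy answers with JSON at this path but the Base URL configured in OpenHands omits /v1, the client can end up receiving a plain string instead of a parsed response object, which would be consistent with the 'str' object has no attribute 'model_dump' message above.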

mamoodi (Collaborator) commented Sep 15, 2024

Hi gaord! Just to make sure I understand: are you saying that it works if you set the Base URL, or that it still doesn't work?

If you have a proxy setup, the Base URL must be specified.

gaord (Author) commented Sep 18, 2024

It works.

tobitege (Collaborator) commented:

@gaord do you by any chance have any more logs from the container showing that error message?

mamoodi (Collaborator) commented Sep 18, 2024

If you are running a proxy, you must set a base URL. See docs:
https://docs.all-hands.dev/modules/usage/llms/openai-llms#using-an-openai-proxy
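
For anyone who prefers passing these values when starting the container rather than through the UI settings, here is a hedged sketch of the quick-start command with the LLM values added. The LLM_MODEL / LLM_API_KEY / LLM_BASE_URL variable names are assumptions (the UI settings are the documented path; check the linked docs for your version), and the Base URL again carries the /v1 suffix:

# NOTE: LLM_MODEL / LLM_API_KEY / LLM_BASE_URL are assumed variable names;
# the documented approach is to set these in the OpenHands UI settings.
docker run -it --pull=always \
    -e SANDBOX_RUNTIME_CONTAINER_IMAGE=ghcr.io/all-hands-ai/runtime:0.9-nikolaik \
    -e SANDBOX_USER_ID=$(id -u) \
    -e WORKSPACE_MOUNT_PATH=$WORKSPACE_BASE \
    -e LLM_MODEL=gpt-4 \
    -e LLM_API_KEY=$YOUR_API_KEY \
    -e LLM_BASE_URL=https://yourhost/v1 \
    -v $WORKSPACE_BASE:/opt/workspace_base \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -p 3000:3000 \
    --add-host host.docker.internal:host-gateway \
    --name openhands-app-$(date +%Y%m%d%H%M%S) \
    ghcr.io/all-hands-ai/openhands:0.9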
