
A question about token limit #1

Open · 0xzhouchenyu opened this issue on Apr 25, 2023 · 2 comments
Labels: bug (Something isn't working)

Comments

@0xzhouchenyu
I cloned your repository and replaced the docs with my own. It works great, but it sometimes throws an error like:

```
2023-04-25 17:20:16.978 error_code=context_length_exceeded error_message="This model's maximum context length is 4097 tokens. However, your messages resulted in 5613 tokens. Please reduce the length of the messages." error_param=messages error_type=invalid_request_error message='OpenAI API error received' stream_error=False
Get response error: This model's maximum context length is 4097 tokens. However, your messages resulted in 5613 tokens. Please reduce the length of the messages.
Response: Connection error. Please try again later.
```

As I am not proficient with LangChain, I would appreciate your help in resolving this issue. Thank you!
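
For context on this class of error: the prompt (question plus retrieved document text) exceeded the model's 4097-token window, so one common workaround is to budget tokens before calling the API. Below is a minimal sketch of that idea using `tiktoken`; the names `MAX_PROMPT_TOKENS`, `trim_context`, `retrieved_chunks`, and `question` are illustrative assumptions, not identifiers from this repository, and the project's actual fix may take a different approach.

```python
# Minimal sketch of a token-budgeting workaround, not this project's actual fix.
# MAX_PROMPT_TOKENS, trim_context, retrieved_chunks, and question are
# illustrative names, assumed for this example.
import tiktoken

MAX_PROMPT_TOKENS = 4097 - 512  # leave headroom for the model's reply

enc = tiktoken.encoding_for_model("gpt-3.5-turbo")

def num_tokens(text: str) -> int:
    """Count tokens with the same tokenizer the OpenAI model uses."""
    return len(enc.encode(text))

def trim_context(question: str, retrieved_chunks: list[str]) -> str:
    """Keep adding chunks until the next one would exceed the token budget."""
    budget = MAX_PROMPT_TOKENS - num_tokens(question)
    kept: list[str] = []
    for chunk in retrieved_chunks:
        cost = num_tokens(chunk)
        if cost > budget:
            break
        kept.append(chunk)
        budget -= cost
    return "\n\n".join(kept)
```

Dropping the lowest-ranked chunks first (as the loop above does implicitly, assuming `retrieved_chunks` is ordered by relevance) keeps the most useful context while guaranteeing the request fits the window.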

@chaosrun chaosrun added the bug Something isn't working label Apr 27, 2023
@chaosrun
Member

Hi, I've confirmed that it's a bug. I will fix it this week.

@0xzhouchenyu
Author

Looking forward to your update!
