
Increase in token usage after upgrading to version >0.6.0 #667

Open
ANKIT13121999 opened this issue Oct 10, 2024 · 2 comments
Labels
bug Something isn't working

Comments

@ANKIT13121999

Describe the bug
I recently upgraded Vanna to v0.7.3 and noticed an increase in the number of tokens being used. For instance, Vanna v0.6 used around ~1400 tokens, but after upgrading to v0.7.3, token usage increased to ~15000. This is causing failures for models that have a token limit of 8k.
 
To Reproduce
Steps to reproduce the behavior: You just need to upgrade the version to v0.7.3 and check the Vanna logs. The prompt generated is significantly larger compared to the earlier version, v0.6.
 
Expected behavior
Token usage should be in the range of 2k, depending on the complexity of the question being asked.
 
Error logs/Screenshots
You can see the Vanna logs, but as they contain training documents, I can't post them here.
 
Desktop (please complete the following information):

  • OS: [e.g. windows]
  • Version: [e.g. 11]
  • Python: [3.11.3]
  • Vanna: [0.7.3]
     
Additional context
These are the logs showing the number of tokens used:

Using model gpt-35-turbo for 14099.0 tokens (approx).
{'error': {'message': "This model's maximum context length is 8192 tokens. However, your messages resulted in 13454 tokens. Please reduce the length of the messages.", 'type': 'invalid_request_error', 'param': 'messages', 'code': 'context_length_exceeded'}}
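The "(approx)" figure in the log can be cross-checked locally before sending a prompt. A minimal sketch, assuming the `tiktoken` package is available for an exact count (this is not Vanna's internal counting method; the characters/4 fallback is only a rough heuristic):

```python
try:
    import tiktoken  # exact count via the model's tokenizer, if installed
except ImportError:
    tiktoken = None

def count_tokens(text: str, model: str = "gpt-3.5-turbo") -> int:
    """Count (or approximate) the tokens a prompt will consume."""
    if tiktoken is not None:
        try:
            enc = tiktoken.encoding_for_model(model)
        except KeyError:
            # Unknown model name (e.g. Azure's "gpt-35-turbo"): fall back
            # to the encoding used by the gpt-3.5/gpt-4 family.
            enc = tiktoken.get_encoding("cl100k_base")
        return len(enc.encode(text))
    # Fallback heuristic: roughly 4 characters per token for English text.
    return max(1, len(text) // 4)

prompt = "SELECT customer_id, SUM(amount) FROM orders GROUP BY customer_id"
print(count_tokens(prompt))
```

Running this on the full prompt printed in the Vanna logs would show whether the jump from ~1400 to ~14000 tokens comes from the prompt itself (e.g. more retrieved training documents being stuffed into context) rather than from a change in how tokens are reported.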
ANKIT13121999 added the bug label on Oct 10, 2024
@heloisypr

Hi friend, can you tell me how you count tokens (in and out) with Vanna?

@zainhoda
Contributor

@ANKIT13121999 do you know which function caused the issue? Was it vn.generate_summary?
