
Correct token calculation for streaming in case of using functions #29

Closed
vladisavvv opened this issue Dec 4, 2023 · 3 comments · Fixed by #170
Labels
bug Something isn't working

Comments

@vladisavvv
Collaborator

Currently we don't calculate tokens for functions and tools

@vladisavvv vladisavvv added the bug Something isn't working label Dec 4, 2023
@adubovik
Collaborator

adubovik commented Jul 8, 2024

  1. Consider enabling stream_options.include_usage=True
  2. Keep in mind that, to this day, we don't know how to count tokens for tools/functions precisely
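A minimal sketch of the `stream_options.include_usage=True` pattern: with the OpenAI streaming API, the final chunk of the stream carries a `usage` object while earlier chunks have `usage` set to `None`, so the consumer just keeps the last non-empty one. The model name and messages below are placeholders; the local simulation uses `SimpleNamespace` stand-ins instead of a live API call.

```python
from types import SimpleNamespace


def accumulate_usage(chunks):
    """Return the usage object from a chat-completions stream created
    with stream_options={"include_usage": True}. Only the final chunk
    carries usage; earlier chunks have usage=None."""
    usage = None
    for chunk in chunks:
        if getattr(chunk, "usage", None) is not None:
            usage = chunk.usage
    return usage


# With the real OpenAI SDK the stream would be created like this
# (model name and inputs are placeholders, not from this issue):
#
# stream = client.chat.completions.create(
#     model="gpt-4o",
#     messages=messages,
#     tools=tools,
#     stream=True,
#     stream_options={"include_usage": True},
# )
# usage = accumulate_usage(stream)

# Local simulation of such a stream:
chunks = [
    SimpleNamespace(usage=None),
    SimpleNamespace(usage=None),
    SimpleNamespace(
        usage=SimpleNamespace(prompt_tokens=12, completion_tokens=5, total_tokens=17)
    ),
]
usage = accumulate_usage(chunks)
print(usage.total_tokens)
```

The upside of this route is that the counts come from the service itself, so tool/function tokens are included without any client-side estimation.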

@adubovik
Collaborator

Just in case: there is an unofficial function tokenizer, which assumes that OpenAI converts functions to TypeScript definitions: https://github.com/rubenselander/openai-function-tokens

@adubovik
Collaborator

adubovik commented Nov 25, 2024

#170 estimates the token count of functions by tokenizing the tools config serialized to a string. This is obviously not an exact match for the OpenAI tokenization, and it may overestimate as well as underestimate the token count, but it's still better than returning 0 completion tokens on a function call.

Use with caution. The recommended way is still stream_options.include_usage=True, which provides the precise token count coming from Azure OpenAI.
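The estimation approach described above can be sketched as a small helper: serialize the tools config to a compact JSON string and run it through a tokenizer. The function name and the tokenizer-injection parameter are illustrative, not from #170; in practice the token counter could be a `tiktoken` encoding (e.g. `lambda s: len(enc.encode(s))`), which is an approximation of, not a match for, OpenAI's internal accounting.

```python
import json
from typing import Any, Callable


def estimate_tool_tokens(
    tools: list[dict[str, Any]],
    count_tokens: Callable[[str], int],
) -> int:
    """Rough estimate of tokens consumed by a tools/functions config:
    tokenize its compact JSON serialization. Over- or underestimates
    relative to OpenAI's real accounting, but beats reporting 0."""
    serialized = json.dumps(tools, separators=(",", ":"))
    return count_tokens(serialized)


# Example tools config (hypothetical function definition):
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "parameters": {"type": "object", "properties": {}},
        },
    }
]

# Crude character-count tokenizer for demonstration; swap in a real
# tokenizer such as tiktoken for a closer estimate.
print(estimate_tool_tokens(tools, len))
```

Injecting the tokenizer as a parameter keeps the helper independent of any particular tokenizer library, which matters here since no client-side tokenizer reproduces the server-side count exactly.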
