
(Fixes) OpenAI Streaming Token Counting + Fixes usage track when litellm.turn_off_message_logging=True #8156

Merged
merged 10 commits into from
Jan 31, 2025

Conversation

ishaan-jaff
Contributor

@ishaan-jaff ishaan-jaff commented Jan 31, 2025

(Fixes) OpenAI Streaming Token Counting + Fixes usage track when litellm.turn_off_message_logging=True

Bug Description

  • For OpenAI streaming, the usage tracked in logging callbacks did not match the usage returned by the OpenAI API

Key changes made

  • For streaming, ensure logging callbacks use the usage from the OpenAI API response
  • When users don't set include_usage, litellm still passes include_usage to the OpenAI API to guarantee accurate token counting
  • Unit testing for both scenarios

Test Cases

  1. test_stream_token_counting_gpt_4o

    • Tests if token usage tracking works correctly when stream_options={"include_usage": True}
    • Verifies that the logging callback records the same usage metrics as returned by OpenAI API
  2. test_stream_token_counting_without_include_usage

    • Tests token usage tracking without explicitly setting include_usage
    • Verifies that the logging callback records the same usage metrics as returned by OpenAI API
  3. test_stream_token_counting_with_redaction

    • Tests token usage tracking when message logging is disabled (litellm.turn_off_message_logging=True)
    • Verifies that the logging callback records the same usage metrics as returned by OpenAI API
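The assertion pattern shared by the three tests above can be illustrated with a self-contained mock stream (no real API call; the chunk shapes, class, and function names here are simplified stand-ins for the OpenAI response format and litellm's callback machinery):

```python
class UsageLogger:
    """Stand-in for a litellm logging callback that records usage."""
    def __init__(self):
        self.usage = None

    def log_success(self, usage):
        self.usage = usage

def consume_stream(chunks, logger):
    """Drain a stream; the final chunk carries the API-reported usage."""
    api_usage = None
    for chunk in chunks:
        if chunk.get("usage") is not None:
            api_usage = chunk["usage"]
    # Key point of the fix: the callback receives the API's own usage,
    # not a locally re-counted estimate.
    logger.log_success(api_usage)
    return api_usage

# Simulated OpenAI stream: content chunks, then a usage-only chunk.
chunks = [
    {"choices": [{"delta": {"content": "Hello"}}], "usage": None},
    {"choices": [{"delta": {"content": " world"}}], "usage": None},
    {"choices": [], "usage": {"prompt_tokens": 5, "completion_tokens": 2, "total_tokens": 7}},
]
logger = UsageLogger()
api_usage = consume_stream(chunks, logger)
# Mirrors the tests: callback usage == usage returned by the API.
assert logger.usage == api_usage == {"prompt_tokens": 5, "completion_tokens": 2, "total_tokens": 7}
```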

Relevant issues

Type

🆕 New Feature
✅ Test

Changes

[REQUIRED] Testing - Attach a screenshot of any new tests passing locally

If UI changes, send a screenshot/GIF of working UI fixes


vercel bot commented Jan 31, 2025


@ishaan-jaff ishaan-jaff changed the title Litellm streaming usage approach 2 (Fixes) OpenAI Streaming Token Counting + Fixes usage track when litellm.turn_off_message_logging=True Jan 31, 2025

codecov bot commented Jan 31, 2025

Codecov Report

All modified and coverable lines are covered by tests ✅


@ishaan-jaff ishaan-jaff merged commit 2cf0daa into main Jan 31, 2025
24 of 31 checks passed