
fix: migrated to latest anthropic version v0.42 (prompt caching now stable) #352

Merged: 1 commit into master on Dec 18, 2024

Conversation

@ErikBjare (Owner) commented Dec 18, 2024

Important

Update the anthropic library to v0.42 and replace the beta prompt-caching API usage in llm_anthropic.py with the now-stable equivalents.

  • Library Update:
    • Update anthropic library to version 0.42 in pyproject.toml.
  • Code Changes in llm_anthropic.py:
    • Remove imports related to beta.prompt_caching.
    • Update function calls from _anthropic.beta.prompt_caching.messages.create to _anthropic.messages.create and similar for stream.
    • Update type references from PromptCachingBetaTextBlockParam to anthropic.types.TextBlockParam and similar for other types (see the sketch below).

This description was created by Ellipsis for e793808. It will automatically update as commits are pushed.
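
For context, here is a minimal sketch of the migration against the stable v0.42 client. It is illustrative only: the model name, prompt text, and variable names are placeholders rather than code from llm_anthropic.py, and the exact shape of gptme's calls may differ.

```python
# Illustrative sketch (not the gptme code): the beta namespace call
#     client.beta.prompt_caching.messages.create(...)
# becomes the stable
#     client.messages.create(...)
# and cache_control is accepted on the stable TextBlockParam type now that
# prompt caching is GA.
import anthropic
from anthropic.types import MessageParam, TextBlockParam

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

# System prompt expressed as typed blocks so the cacheable prefix can be marked.
system: list[TextBlockParam] = [
    {
        "type": "text",
        "text": "You are a helpful assistant.",
        "cache_control": {"type": "ephemeral"},  # prompt caching, now stable
    }
]
messages: list[MessageParam] = [{"role": "user", "content": "Hello!"}]

response = client.messages.create(
    model="claude-3-haiku-20240307",  # placeholder model
    max_tokens=256,
    system=system,
    messages=messages,
)
print(response.content[0].text)  # assumes the first content block is text
```

The streaming path follows the same pattern: `client.beta.prompt_caching.messages.stream(...)` becomes the stable `client.messages.stream(...)` context manager.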

@ellipsis-dev (Contributor, bot) left a comment

👍 Looks good to me! Reviewed everything up to e793808 in 23 seconds

More details
  • Looked at 155 lines of code in 2 files
  • Skipped 1 file when reviewing.
  • Skipped posting 2 drafted comments based on config settings.
1. gptme/llm/llm_anthropic.py:53
  • Draft comment:
    The removal of beta.prompt_caching imports and usage is consistent with the update to version 0.42. Ensure that all type references like anthropic.types.TextBlockParam are correctly defined in the new version.
  • Reason this comment was not posted:
    Confidence changes required: 50%
    The PR updates the Anthropic library version and removes references to the beta prompt caching. The changes seem consistent with the update, but there is a potential issue with the use of type hints and imports.
2. gptme/llm/llm_anthropic.py:350
  • Draft comment:
    The use of # type: ignore suggests that there might be a type mismatch or an issue with type inference. Consider revisiting the type definitions to ensure they align with the expected types from the Anthropic API.
  • Reason this comment was not posted:
    Confidence changes required: 50%
    The PR updates the Anthropic library version and removes references to the beta prompt caching. The changes seem consistent with the update, but there is a potential issue with the use of type hints and imports.
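
To make the typing concern in draft comment 2 concrete, here is a hedged sketch; the helper name `to_text_blocks` is hypothetical and not taken from llm_anthropic.py. The idea is that annotating blocks with the stable `anthropic.types.TextBlockParam` lets a type checker validate them, rather than silencing mismatches with `# type: ignore`.

```python
# Hypothetical helper (not from gptme): builds dicts that match the stable
# anthropic.types.TextBlockParam TypedDict, so mypy can check them directly
# instead of needing a "# type: ignore" escape hatch.
from anthropic.types import TextBlockParam


def to_text_blocks(texts: list[str]) -> list[TextBlockParam]:
    """Convert plain strings into text blocks accepted by client.messages.create()."""
    return [{"type": "text", "text": t} for t in texts]


blocks = to_text_blocks(["You are a helpful assistant.", "Answer concisely."])
```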

Workflow ID: wflow_tmgRMv67wJHYbYhe


You can customize Ellipsis with 👍 / 👎 feedback, review rules, user-specific overrides, quiet mode, and more.

@codecov-commenter commented Dec 18, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 72.33%. Comparing base (b4d0c41) to head (e793808).

✅ All tests successful. No failed tests found.

Additional details and impacted files
@@            Coverage Diff             @@
##           master     #352      +/-   ##
==========================================
+ Coverage   69.14%   72.33%   +3.18%     
==========================================
  Files          67       67              
  Lines        5133     5132       -1     
==========================================
+ Hits         3549     3712     +163     
+ Misses       1584     1420     -164     
Flag                                 Coverage Δ
anthropic/claude-3-haiku-20240307    70.53% <100.00%> (?)
openai/gpt-4o-mini                   69.15% <0.00%> (+0.01%) ⬆️

Flags with carried forward coverage won't be shown.


@ErikBjare ErikBjare merged commit 0ecf045 into master Dec 18, 2024
7 checks passed