
Prompt Management - support router + optional params #7594

Merged

merged 8 commits into main on Jan 7, 2025

Conversation

@krrishdholakia (Contributor) commented Jan 7, 2025

  • fix(custom_logger.py): expose new 'async_get_chat_completion_prompt' event hook (see the hook sketch below)

  • fix(custom_logger.py, langfuse_prompt_management.py): remove 'headers' from the custom logger 'async_get_chat_completion_prompt' and 'get_chat_completion_prompt' event hooks

  • feat(router.py): expose new function for prompt-management-based routing

  • feat(router.py): partial working router prompt factory logic

allows a load-balanced model group to be used as the model name in a Langfuse prompt management call (see the usage sketch below)

  • feat(router.py): fix prompt management with a load-balanced model group

  • feat(langfuse_prompt_management.py): support reading openai params in from Langfuse

enables users to define optional params on Langfuse instead of in client code

  • test(test_Router.py): add unit test for router-based Langfuse prompt management

  • fix: fix linting errors
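
For context, a minimal sketch of a CustomLogger subclass overriding the new hook. The exact hook signature isn't reproduced in this thread, so the parameters below (prompt_id, prompt_variables, non_default_params) and the return shape are assumptions inferred from the commit messages; note that this PR removes 'headers' from the hook.

```python
from typing import List, Optional, Tuple

from litellm.integrations.custom_logger import CustomLogger


class MyPromptManager(CustomLogger):
    """Sketch of a prompt-management hook; parameter names are assumed."""

    async def async_get_chat_completion_prompt(
        self,
        model: str,
        messages: List[dict],
        non_default_params: dict,
        prompt_id: str,
        prompt_variables: Optional[dict],
        **kwargs,  # any remaining hook params, omitted in this sketch
    ) -> Tuple[str, List[dict], dict]:
        # Look up the prompt for prompt_id, substitute prompt_variables,
        # and return the (possibly rewritten) model, messages, and params.
        compiled = {
            "role": "system",
            "content": f"You are prompt '{prompt_id}'.",
        }
        return model, [compiled] + messages, non_default_params
```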
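
And a usage sketch of the router-based flow, assuming the "langfuse/" model prefix and the prompt_id/prompt_variables kwargs from LiteLLM's prompt management docs; the model names and prompt id here are placeholders:

```python
import asyncio

from litellm import Router

# "gpt-3.5-turbo" is a load-balanced model group with (here) one deployment.
router = Router(
    model_list=[
        {
            "model_name": "gpt-3.5-turbo",
            "litellm_params": {"model": "openai/gpt-3.5-turbo"},
        }
    ]
)


async def main():
    # The "langfuse/" prefix triggers Langfuse prompt management; with this
    # PR, the part after the slash resolves against the router's model groups.
    response = await router.acompletion(
        model="langfuse/gpt-3.5-turbo",
        prompt_id="my-langfuse-prompt",  # hypothetical Langfuse prompt name
        prompt_variables={"user_message": "hello"},
        messages=[{"role": "user", "content": "fallback message"}],
    )
    print(response)


asyncio.run(main())
```

With the Langfuse params change, optional params saved on the Langfuse prompt (e.g. temperature or max_tokens in the prompt's config) are read in and applied to the call, so they no longer need to be passed from client code.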

vercel bot commented Jan 7, 2025

The latest updates on your projects:

litellm: ✅ Ready — Updated (UTC) Jan 7, 2025 4:12am


codecov bot commented Jan 7, 2025

Codecov Report

Attention: Patch coverage is 56.89655% with 25 lines in your changes missing coverage. Please review.

Files with missing lines                              Patch %   Lines
litellm/router.py                                     41.02%    23 Missing ⚠️
litellm/integrations/custom_logger.py                 50.00%    1 Missing ⚠️
…ntegrations/langfuse/langfuse_prompt_management.py   90.90%    1 Missing ⚠️


@krrishdholakia merged commit fef7839 into main on Jan 7, 2025 — 30 of 31 checks passed

@krrishdholakia changed the title from "Litellm dev 01 06 2025 p1" to "Prompt Management - support router + optional params" on Jan 7, 2025
rajatvig pushed a commit to rajatvig/litellm that referenced this pull request Jan 16, 2025