
Support checking provider-specific /models endpoints for available models based on key #7538

Merged

merged 10 commits from litellm_dev_01_03_2025_p2 into main on Jan 4, 2025

Conversation

@krrishdholakia (Contributor) commented Jan 3, 2025

Title

Updates get_valid_models() so it can fetch the available models for a provider based on the env key:

from litellm.utils import get_valid_models

valid_models = get_valid_models(check_provider_endpoint=True)
  • support 'WATSONX_ZENAPIKEY' for IAM auth
  • feat(fireworks_ai/transformation.py): support retrieving valid models from the Fireworks AI endpoint on /model/info
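Putting the pieces above together, a minimal sketch of the intended flow (the import and the get_valid_models() call are from this PR; the env var setup around them is an assumption for illustration):

import os

from litellm.utils import get_valid_models

# Hypothetical keys for illustration; get_valid_models infers which
# providers to query from whichever provider env vars are set.
os.environ["OPENAI_API_KEY"] = "sk-..."
os.environ["FIREWORKS_AI_API_KEY"] = "fw-..."

# With check_provider_endpoint=True, each configured provider's /models
# (or equivalent) endpoint is queried, so the list reflects what the key
# can actually access rather than litellm's static model map.
valid_models = get_valid_models(check_provider_endpoint=True)
print(valid_models)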

Relevant issues

Fixes #7525

Type

🆕 New Feature
🐛 Bug Fix
🧹 Refactoring
📖 Documentation
🚄 Infrastructure
✅ Test

Changes

[REQUIRED] Testing - Attach a screenshot of any new tests passing locally

If UI changes, send a screenshot/GIF of working UI fixes

vercel bot commented Jan 3, 2025

The latest updates on your projects:

Name     Status    Updated (UTC)
litellm  ✅ Ready   Jan 4, 2025 3:14am

@krrishdholakia merged commit f770dd0 into main on Jan 4, 2025
27 of 29 checks passed
@krrishdholakia deleted the litellm_dev_01_03_2025_p2 branch on January 4, 2025 at 03:30
rajatvig pushed a commit to rajatvig/litellm that referenced this pull request on Jan 16, 2025:

Support checking provider-specific /models endpoints for available models based on key (BerriAI#7538)

* test(test_utils.py): initial test for valid models

Addresses BerriAI#7525

* fix: test

* feat(fireworks_ai/transformation.py): support retrieving valid models from fireworks ai endpoint

* refactor(fireworks_ai/): support checking model info on `/v1/models` route

* docs(set_keys.md): update docs to clarify check llm provider api usage

* fix(watsonx/common_utils.py): support 'WATSONX_ZENAPIKEY' for iam auth

* fix(watsonx): read in watsonx token from env var

* fix: fix linting errors

* fix(utils.py): fix provider config check

* style: cleanup unused imports
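As a rough illustration of the first item in the commit list above ("initial test for valid models"), a hypothetical test along these lines could exercise the new flag; the actual test lives in test_utils.py and may differ:

import os

import pytest

from litellm.utils import get_valid_models


def test_get_valid_models_checks_provider_endpoint():
    # check_provider_endpoint=True queries the provider's live /models
    # endpoint, so this needs a real key in the environment; skip otherwise.
    if not os.getenv("FIREWORKS_AI_API_KEY"):
        pytest.skip("no FIREWORKS_AI_API_KEY set")
    models = get_valid_models(check_provider_endpoint=True)
    assert isinstance(models, list)
    assert len(models) > 0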
Development

Successfully merging this pull request may close these issues.

[Feature]: use litellm python SDK to validate models on proxy config.yaml
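The linked issue asks for roughly the pattern below: validating the models declared in a proxy config.yaml against what the configured keys can actually reach. A hypothetical sketch (the file path and YAML shape follow the standard proxy config; the script itself is an assumption, not code from this PR):

import yaml

from litellm.utils import get_valid_models

with open("config.yaml") as f:
    config = yaml.safe_load(f)

# Models the configured provider keys can actually access
available = set(get_valid_models(check_provider_endpoint=True))

for entry in config.get("model_list", []):
    model = entry.get("litellm_params", {}).get("model")
    if model and model not in available:
        print(f"warning: {model} is not available for the configured keys")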