[Feature]: cost for bedrock cross-region inference model isn't mapped #8115
Labels: enhancement (New feature or request)

Comments
@ramammah I can see the cost mapped - litellm/model_prices_and_context_window.json, line 5857 (commit a713d7d).
How do I repro this error? cc: @ishaan-jaff for your QA effort
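For context, one quick way to see which forms of the model string are present in the loaded pricing map is to look them up in `litellm.model_cost`, the in-memory copy of `model_prices_and_context_window.json`. This is only a sketch; the exact keys checked below are assumptions based on the model names quoted in this issue.

```python
import litellm

# litellm.model_cost is the pricing map loaded from
# model_prices_and_context_window.json. Check which forms of the
# model string are actually mapped (keys below are assumptions
# taken from the names quoted in this issue).
print("us.anthropic.claude-3-haiku-20240307-v1:0" in litellm.model_cost)
print("us-east-2/us.anthropic.claude-3-haiku-20240307-v1:0" in litellm.model_cost)
```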
Unable to repro with this -

```python
import litellm

resp = litellm.completion(
    model="bedrock/us.anthropic.claude-3-haiku-20240307-v1:0",
    messages=[{"role": "user", "content": "Hello, how are you?"}],
    aws_region_name="us-east-1",
    mock_response="Hello, how are you?",
)
assert resp._hidden_params["response_cost"] > 0
```
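The reporter's traceback (quoted below under "The Feature") has the region segment embedded in the model string itself (`bedrock/us-east-2/us.anthropic...`), which the passing test above does not use. Here is a minimal sketch of that variant, assuming `mock_response` exercises the cost-calculation path the same way; whether it actually reproduces the failure would need to be confirmed.

```python
import litellm

# Hypothetical repro variant: the region is part of the model string,
# matching the model name in the reporter's traceback, instead of being
# passed via aws_region_name.
resp = litellm.completion(
    model="bedrock/us-east-2/us.anthropic.claude-3-haiku-20240307-v1:0",
    messages=[{"role": "user", "content": "Hello, how are you?"}],
    mock_response="Hello, how are you?",
)
# Per the report, cost lookup fails for this form of the model string,
# so response_cost may be missing rather than > 0.
print(resp._hidden_params.get("response_cost"))
```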
krrishdholakia added a commit that referenced this issue on Feb 1, 2025.
Unable to repro with a direct test and the exact parameters given to …
The Feature
"This model isn't mapped yet. model=bedrock/us-east-2/us.anthropic.claude-3-haiku-20240307-v1:0, custom_llm_provider=bedrock. Add it here - https://github.com/BerriAI/litellm/blob/main/model_prices_and_context_window.json."
```
Traceback (most recent call last):
  File "/usr/lib/python3.13/site-packages/litellm/utils.py", line 4277, in _get_model_info_helper
    raise ValueError(
        "This model isn't mapped yet. Add it here - https://github.com/BerriAI/litellm/blob/main/model_prices_and_context_window.json"
    )
ValueError: This model isn't mapped yet. Add it here - https://github.com/BerriAI/litellm/blob/main/model_prices_and_context_window.json

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.13/site-packages/litellm/litellm_core_utils/litellm_logging.py", line 832, in _response_cost_calculator
    response_cost = litellm.response_cost_calculator(
        **response_cost_calculator_kwargs
    )
  File "/usr/lib/python3.13/site-packages/litellm/cost_calculator.py", line 830, in response_cost_calculator
    raise e
  File "/usr/lib/python3.13/site-packages/litellm/cost_calculator.py", line 818, in response_cost_calculator
    response_cost = completion_cost(
        completion_response=response_object,
        ...<6 lines>...
        prompt=prompt,
    )
  File "/usr/lib/python3.13/site-packages/litellm/cost_calculator.py", line 768, in completion_cost
    raise e
  File "/usr/lib/python3.13/site-packages/litellm/cost_calculator.py", line 747, in completion_cost
    ) = cost_per_token(
        ~~~~~~~~~~~~~~^
        model=model,
        ^^^^^^^^^^^^
        ...<13 lines>...
        audio_transcription_file_duration=audio_transcription_file_duration,
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    )
    ^
  File "/usr/lib/python3.13/site-packages/litellm/cost_calculator.py", line 287, in cost_per_token
    model_info = _cached_get_model_info_helper(
        model=model, custom_llm_provider=custom_llm_provider
    )
  File "/usr/lib/python3.13/site-packages/litellm/caching/_internal_lru_cache.py", line 25, in wrapped
    raise result[1]
  File "/usr/lib/python3.13/site-packages/litellm/caching/_internal_lru_cache.py", line 18, in wrapper
    return ("success", f(*args, **kwargs))
           ~^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.13/site-packages/litellm/utils.py", line 4155, in _cached_get_model_info_helper
    return _get_model_info_helper(model=model, custom_llm_provider=custom_llm_provider)
  File "/usr/lib/python3.13/site-packages/litellm/utils.py", line 4384, in _get_model_info_helper
    raise Exception(
        ...<3 lines>...
    )
Exception: This model isn't mapped yet. model=bedrock/us-east-2/us.anthropic.claude-3-haiku-20240307-v1:0, custom_llm_provider=bedrock. Add it here - https://github.com/BerriAI/litellm/blob/main/model_prices_and_context_window.json.
```

Model: `"us.anthropic.claude-3-haiku-20240307-v1:0"`. It should be the same price as the OD (on-demand) model ID.
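If the region-prefixed model string really is the only thing missing from the pricing map, a possible interim workaround is to register it at runtime with `litellm.register_model`, mirroring the on-demand Haiku entry. The key format and all cost/limit values below are assumptions and should be verified against `model_prices_and_context_window.json` before relying on them.

```python
import litellm

# Hypothetical workaround: map the region-prefixed model string to the same
# pricing as the on-demand Haiku entry. The numbers below are assumptions;
# verify them against model_prices_and_context_window.json.
litellm.register_model({
    "us-east-2/us.anthropic.claude-3-haiku-20240307-v1:0": {
        "max_tokens": 4096,
        "input_cost_per_token": 0.00000025,
        "output_cost_per_token": 0.00000125,
        "litellm_provider": "bedrock",
        "mode": "chat",
    }
})
```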
Motivation, pitch
AWS is asking its customers to move toward CRIS (cross-region inference).
Are you a ML Ops Team?
No
Twitter / LinkedIn details
No response