
[Feature]: cost for bedrock Cross region inference model isn't mapped #8115

Closed
ramammah opened this issue Jan 30, 2025 · 3 comments · Fixed by #8184
Labels
enhancement New feature or request

Comments

@ramammah
The Feature

"This model isn't mapped yet. model=bedrock/us-east-2/us.anthropic.claude-3-haiku-20240307-v1:0, custom_llm_provider=bedrock. Add it here - https://github.com/BerriAI/litellm/blob/main/model_prices_and_context_window.json."

Traceback (most recent call last):
  File "/usr/lib/python3.13/site-packages/litellm/utils.py", line 4277, in _get_model_info_helper
    raise ValueError(
        "This model isn't mapped yet. Add it here - https://github.com/BerriAI/litellm/blob/main/model_prices_and_context_window.json"
    )
ValueError: This model isn't mapped yet. Add it here - https://github.com/BerriAI/litellm/blob/main/model_prices_and_context_window.json

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.13/site-packages/litellm/litellm_core_utils/litellm_logging.py", line 832, in _response_cost_calculator
    response_cost = litellm.response_cost_calculator(
        **response_cost_calculator_kwargs
    )
  File "/usr/lib/python3.13/site-packages/litellm/cost_calculator.py", line 830, in response_cost_calculator
    raise e
  File "/usr/lib/python3.13/site-packages/litellm/cost_calculator.py", line 818, in response_cost_calculator
    response_cost = completion_cost(
        completion_response=response_object,
        ...<6 lines>...
        prompt=prompt,
    )
  File "/usr/lib/python3.13/site-packages/litellm/cost_calculator.py", line 768, in completion_cost
    raise e
  File "/usr/lib/python3.13/site-packages/litellm/cost_calculator.py", line 747, in completion_cost
    ) = cost_per_token(
        model=model,
        ...<13 lines>...
        audio_transcription_file_duration=audio_transcription_file_duration,
    )
  File "/usr/lib/python3.13/site-packages/litellm/cost_calculator.py", line 287, in cost_per_token
    model_info = _cached_get_model_info_helper(
        model=model, custom_llm_provider=custom_llm_provider
    )
  File "/usr/lib/python3.13/site-packages/litellm/caching/_internal_lru_cache.py", line 25, in wrapped
    raise result[1]
  File "/usr/lib/python3.13/site-packages/litellm/caching/_internal_lru_cache.py", line 18, in wrapper
    return ("success", f(*args, **kwargs))
  File "/usr/lib/python3.13/site-packages/litellm/utils.py", line 4155, in _cached_get_model_info_helper
    return _get_model_info_helper(model=model, custom_llm_provider=custom_llm_provider)
  File "/usr/lib/python3.13/site-packages/litellm/utils.py", line 4384, in _get_model_info_helper
    raise Exception(
        ...<3 lines>...
    )
Exception: This model isn't mapped yet. model=bedrock/us-east-2/us.anthropic.claude-3-haiku-20240307-v1:0, custom_llm_provider=bedrock. Add it here - https://github.com/BerriAI/litellm/blob/main/model_prices_and_context_window.json.

It should be priced the same as the corresponding on-demand (OD) model ID.
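The mapping the reporter describes could be sketched as a lookup that strips the provider and region segments from the model string before consulting the pricing table. This is a minimal, self-contained sketch, not LiteLLM's actual implementation: the `resolve_pricing_key` helper and the `PRICING` dict are hypothetical, and the cost values are illustrative only.

```python
# Hypothetical sketch: resolve a Bedrock cross-region inference (CRIS) model id
# to its pricing entry by dropping the provider prefix and any AWS region segment.
PRICING = {
    # Entry shaped like model_prices_and_context_window.json (values illustrative)
    "us.anthropic.claude-3-haiku-20240307-v1:0": {
        "input_cost_per_token": 0.25e-6,
        "output_cost_per_token": 1.25e-6,
    },
}

def resolve_pricing_key(model: str) -> str:
    """'bedrock/us-east-2/us.anthropic....' -> 'us.anthropic....'"""
    parts = model.split("/")
    if parts and parts[0] == "bedrock":
        parts = parts[1:]
    # AWS region segments look like 'us-east-2', 'eu-west-1', etc.:
    # exactly two hyphens, and not a CRIS prefix like 'us.anthropic...'.
    if parts and parts[0].count("-") == 2 and not parts[0].startswith("us."):
        parts = parts[1:]
    return "/".join(parts)

key = resolve_pricing_key("bedrock/us-east-2/us.anthropic.claude-3-haiku-20240307-v1:0")
print(key in PRICING)  # prints True: the CRIS id resolves to the mapped entry
```

With this normalization, the region-qualified model string from the traceback resolves to the same pricing entry as the plain CRIS model ID.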

Motivation, pitch

AWS is asking its customers to move toward CRIS (cross-region inference).

Are you a ML Ops Team?

No

Twitter / LinkedIn details

No response

@ramammah ramammah added the enhancement New feature or request label Jan 30, 2025
@krrishdholakia
Contributor

@ramammah I can see the cost mapped -

"us.anthropic.claude-3-haiku-20240307-v1:0": {

How do I repro this error?

cc: @ishaan-jaff for your qa effort

@krrishdholakia
Contributor

Unable to repro with this -

resp = litellm.completion(
    model="bedrock/us.anthropic.claude-3-haiku-20240307-v1:0",
    messages=[{"role": "user", "content": "Hello, how are you?"}],
    aws_region_name="us-east-1",
    mock_response="Hello, how are you?",
)
assert resp._hidden_params["response_cost"] > 0

@krrishdholakia
Contributor

Unable to repro with a direct test and the exact parameters given to get_model_info. Adding this test to our CI/CD, but I suspect you're on an older version @ramammah

@krrishdholakia krrishdholakia closed this as not planned (can't repro) on Feb 1, 2025
krrishdholakia added a commit that referenced this issue Feb 3, 2025
* fix(main.py): fix passing openrouter specific params

Fixes #8130

* test(test_get_model_info.py): add check for region name w/ cris model

Resolves #8115