[Bug]: Cannot pass provider-specific parameters to Bedrock Anthropic models #7782

Closed
mrm1001 opened this issue Jan 15, 2025 · 2 comments · Fixed by #8131
Labels
bug Something isn't working

mrm1001 commented Jan 15, 2025

What happened?

Passing the top_k parameter to an Anthropic model on the Bedrock platform throws this error:
BadRequestError: litellm.BadRequestError: BedrockException - {"message":"The model returned the following errors: Malformed input request: #: extraneous key [inferenceConfig] is not permitted, please reformat your input and try again."}
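
A minimal call that triggers this looks roughly like the following (the model ID is only illustrative, not necessarily the exact one I used):

import litellm

# Provider-specific parameter (top_k) passed through litellm.completion
# to a Bedrock Anthropic model; the model ID below is just an example.
response = litellm.completion(
    model="bedrock/anthropic.claude-3-sonnet-20240229-v1:0",
    messages=[{"role": "user", "content": "Hello"}],
    top_k=20,
)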

I think the issue is with the Bedrock Converse API: sometimes the additional parameters are passed like this (from the API docs):

additionalModelRequestFields = {
    "inferenceConfig": {
         "topK": 20
    }
}

model_response = client.converse(
    modelId="us.amazon.nova-lite-v1:0", 
    messages=messages, 
    system=system, 
    inferenceConfig=inf_params,
    additionalModelRequestFields=additionalModelRequestFields
)

Sometimes it's passed without the inferenceConfig key, as in this example from the API docs:

additional_model_fields = {"top_k": top_k}

# Send the message.
response = bedrock_client.converse(
    modelId=model_id,
    messages=messages,
    system=system_prompts,
    inferenceConfig=inference_config,
    additionalModelRequestFields=additional_model_fields
)

I managed to fix the error for Anthropic by changing the litellm code here to this:

      if "topK" in inference_params:
            additional_request_params = {
                "top_k": inference_params.pop("topK")
            }
        elif "top_k" in inference_params:
            additional_request_params = {
                "top_k": inference_params.pop("top_k")
            }

However, if you do that, it stops working for the Nova model, so I'm not sure what the solution is here!
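
One possible direction, purely as a sketch (the helper name and the naive model check below are mine, not how the LiteLLM code is actually structured): decide whether to nest the value under inferenceConfig based on the model family, since Nova expects the nested form while Anthropic expects the flat top_k key.

def build_additional_request_fields(model_id: str, inference_params: dict) -> dict:
    # Sketch only: Nova models on Bedrock appear to expect the nested
    # {"inferenceConfig": {"topK": ...}} form, while Anthropic models
    # expect a flat {"top_k": ...}. The substring check is naive and
    # purely illustrative.
    top_k = None
    if "topK" in inference_params:
        top_k = inference_params.pop("topK")
    elif "top_k" in inference_params:
        top_k = inference_params.pop("top_k")
    if top_k is None:
        return {}
    if "nova" in model_id:
        return {"inferenceConfig": {"topK": top_k}}
    return {"top_k": top_k}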

Relevant log output

No response

Are you a ML Ops Team?

No

What LiteLLM version are you on ?

1.58.2

Twitter / LinkedIn details

No response

@mrm1001 mrm1001 added the bug Something isn't working label Jan 15, 2025
@krrishdholakia krrishdholakia self-assigned this Jan 26, 2025
Contributor

krrishdholakia commented Jan 26, 2025

However if you do that, then it stops working for the Nova model. So I'm not sure what the solution is here!

Thank you for this ticket @mrm1001 - what's the error thrown by Nova?

@ishaan-jaff
Contributor

fixed in #8131 by @vibhavbhat
