What happened?
Passing the `top_k` parameter to an Anthropic model on the Bedrock platform throws this error:

```
BadRequestError: litellm.BadRequestError: BedrockException - {"message":"The model returned the following errors: Malformed input request: #: extraneous key [inferenceConfig] is not permitted, please reformat your input and try again."}
```
I think the issue is with the Bedrock Converse API; sometimes the additional parameters are passed in this way (from the API docs):
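The docs example did not survive the copy here. As a hedged sketch of what that shape looks like (the model ID and parameter values are illustrative assumptions; the nested `inferenceConfig` wrapping under `additionalModelRequestFields` is the Nova-style form as I understand the AWS docs), a Converse request body carrying extra sampling parameters can look like this:

```python
# Hedged sketch of a Bedrock Converse request body (boto3-style kwargs).
# The nested "inferenceConfig" under additionalModelRequestFields is the
# wrapping that Amazon Nova models accept for extras such as topK; the
# model id and values below are placeholders for illustration only.
request = {
    "modelId": "amazon.nova-lite-v1:0",  # hypothetical model id
    "messages": [
        {"role": "user", "content": [{"text": "Hello"}]},
    ],
    # Standard sampling knobs live in the top-level inferenceConfig...
    "inferenceConfig": {"temperature": 0.7, "maxTokens": 256},
    # ...while model-specific extras are wrapped again under an
    # additionalModelRequestFields.inferenceConfig key for Nova.
    "additionalModelRequestFields": {"inferenceConfig": {"topK": 20}},
}
```

It is exactly this inner `inferenceConfig` key that the Anthropic models reject as "extraneous".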
Sometimes it's passed without the `inferenceConfig` key, such as this example in the API docs:

I managed to fix the error for Anthropic by changing the litellm code here to this:
However, if you do that, it stops working for the Nova model, so I'm not sure what the solution is here!
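Since Anthropic rejects the wrapper while Nova seems to require it, one possible direction is to branch on the model family when building `additionalModelRequestFields`. This is a sketch, not litellm's actual code; the prefix checks and the helper name are my assumptions:

```python
def build_additional_fields(model_id: str, top_k: int) -> dict:
    """Sketch: shape the top-k extra field per Bedrock model family.

    Anthropic models on Bedrock expect a bare top_k key, while Amazon
    Nova models expect it nested under inferenceConfig (camelCased as
    topK). The model-id prefix checks below are illustrative.
    """
    if model_id.startswith("anthropic."):
        return {"top_k": top_k}  # bare key: what Anthropic accepts
    if model_id.startswith("amazon.nova"):
        return {"inferenceConfig": {"topK": top_k}}  # Nova-style wrapping
    # Unknown family: fall back to the bare key.
    return {"top_k": top_k}


# The two families produce different shapes for the same parameter.
anthropic_fields = build_additional_fields(
    "anthropic.claude-3-5-sonnet-20240620-v1:0", 40
)
nova_fields = build_additional_fields("amazon.nova-lite-v1:0", 40)
```

A per-provider mapping table inside litellm's Bedrock transformation would achieve the same thing without hard-coding prefixes in one function.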
Relevant log output
No response
Are you a ML Ops Team?
No
What LiteLLM version are you on ?
1.58.2
Twitter / LinkedIn details
No response