Add Amazon Bedrock support #483
Conversation
```diff
@@ -60,6 +62,12 @@ def __init__(self):
         litellm.vertex_location = get_settings().get(
             "VERTEXAI.VERTEX_LOCATION", None
         )
+        if get_settings().get("AWS.BEDROCK_REGION", None):
+            litellm.AmazonAnthropicConfig.max_tokens_to_sample = 2000
```
We need this because `max_tokens_to_sample` in litellm is 256 by default, and that will stop the completion in the middle of a review. We could allow users to set this parameter from a config file, but I think hardcoding this large value is enough for most users.
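For reference, a minimal sketch of what a user-configurable variant could look like; the `AWS.MAX_TOKENS_TO_SAMPLE` key is hypothetical and not part of this PR:

```python
import litellm
from pr_agent.config_loader import get_settings

if get_settings().get("AWS.BEDROCK_REGION", None):
    # hypothetical config key; falls back to the hardcoded 2000 when unset
    litellm.AmazonAnthropicConfig.max_tokens_to_sample = get_settings().get(
        "AWS.MAX_TOKENS_TO_SAMPLE", 2000
    )
```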
```diff
             }
+            if self.aws_bedrock_client:
+                kwargs["aws_bedrock_client"] = self.aws_bedrock_client
             response = await acompletion(**kwargs)
```
Adding the `aws_bedrock_client` kwarg when calling other providers like OpenAI fails with the error below, so I only add this argument when using Bedrock:

```
openai.error.InvalidRequestError: Unrecognized request argument supplied: aws_bedrock_client
```
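A sketch of the overall pattern, assuming a boto3 `bedrock-runtime` client; the class and method names here are simplified stand-ins, not the PR's actual handler code:

```python
import boto3
from litellm import acompletion
from pr_agent.config_loader import get_settings

class AiHandler:  # simplified sketch, not the PR's full handler
    def __init__(self):
        self.aws_bedrock_client = None
        if get_settings().get("AWS.BEDROCK_REGION", None):
            # boto3 >= 1.28.57 provides the "bedrock-runtime" service client
            self.aws_bedrock_client = boto3.client(
                "bedrock-runtime",
                region_name=get_settings().get("AWS.BEDROCK_REGION"),
            )

    async def chat_completion(self, model: str, messages: list):
        kwargs = {"model": model, "messages": messages}
        # pass the client only for Bedrock; OpenAI rejects unknown arguments
        if self.aws_bedrock_client:
            kwargs["aws_bedrock_client"] = self.aws_bedrock_client
        return await acompletion(**kwargs)
```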
pr_agent/algo/utils.py
````diff
@@ -290,6 +290,7 @@ def _fix_key_value(key: str, value: str):

 def load_yaml(response_text: str) -> dict:
     response_text = response_text.removeprefix('```yaml').rstrip('`')
+    response_text = response_text.strip().rstrip().removeprefix('{').removesuffix('}')
````
Claude tends to add a leading `{` and a trailing `}` in the generated yaml, so I'm removing these characters here. `try_fix_yaml` didn't work for some reason.
This is problematic. What if the message actually starts with `{`? It's not something that should appear in yaml in any case, so removing it unconditionally like this is risky.
Move this line, as a separate fallback, to the `try_fix_yaml` section. But this is not good for Claude in terms of the claude-vs-openai comparison :-)
Hi, thanks. I moved them to `try_fix_yaml`. Yeah, I think I'll have to optimize the prompts for Claude :)
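Since this moved into `try_fix_yaml`, here is a minimal sketch of what the relocated fallback could look like; the surrounding fallback chain is elided and the exact function body is an assumption, not the PR's final code:

```python
import yaml

def try_fix_yaml(response_text: str) -> dict:
    # ... earlier, less invasive fix attempts go here ...

    # last-resort fallback (sketch): strip the stray braces Claude wraps
    # around its yaml output, then retry parsing
    try:
        stripped = response_text.strip().removeprefix('{').removesuffix('}')
        data = yaml.safe_load(stripped)
        if data:
            return data
    except yaml.YAMLError:
        pass
    raise yaml.YAMLError(f"failed to parse response as yaml:\n{response_text}")
```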
```diff
@@ -14,7 +14,7 @@ GitPython==3.1.32
 PyYAML==6.0
 starlette-context==0.3.6
 litellm==0.12.5
-boto3==1.28.25
+boto3==1.33.1
```
litellm requires `boto3>=1.28.57`, so I updated boto3 to the latest version. I don't see any breaking changes between 1.28.25 and 1.33.1 (doc).
```diff
@@ -18,4 +18,7 @@
     'vertex_ai/codechat-bison-32k': 32000,
     'codechat-bison': 6144,
     'codechat-bison-32k': 32000,
+    'anthropic.claude-v2': 100000,
+    'anthropic.claude-instant-v1': 100000,
+    'anthropic.claude-v1': 100000,
```
I'm adding support only for Claude for now, because it's difficult to tune additional parameters like `max_tokens_to_sample` specifically for each model.
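For context, a sketch of how a token-limit table like this is typically consulted; the helper name and the 4096 default are illustrative, not from this PR:

```python
MAX_TOKENS = {
    # ... existing entries elided ...
    'anthropic.claude-v2': 100000,
    'anthropic.claude-instant-v1': 100000,
    'anthropic.claude-v1': 100000,
}

def get_max_tokens(model: str) -> int:
    # illustrative helper: default conservatively when a model is unknown
    return MAX_TOKENS.get(model, 4096)
```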
thanks @tmokmss
Add Amazon Bedrock support
closes #459
Added support for Anthropic Claude on Amazon Bedrock. I'll add some explanation in the review comments.