feat(construct): bedrock batch step functions fragment #928
Conversation
This PR introduces a new L3 construct `BedrockBatchSfn` that simplifies running batch inference jobs with Amazon Bedrock using AWS Step Functions. The construct creates a state machine fragment that:

- Processes multiple input manifests in parallel using a Step Functions Map state
- Creates and monitors Bedrock batch inference jobs asynchronously
- Handles job completion to continue state machine execution without polling
- Manages IAM roles and permissions for secure execution
- Provides configurable timeout settings (24h-168h range)

This construct helps developers implement batch inference workflows without managing complex infrastructure code, while following AWS best practices for security and scalability.
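A minimal usage sketch is shown below. The property names passed to `BedrockBatchSfn` are hypothetical (the construct's actual props may differ); the surrounding CDK and Step Functions APIs are standard `aws-cdk-lib`, and the import path assumes the published `@cdklabs/generative-ai-cdk-constructs` package:

```ts
import * as cdk from 'aws-cdk-lib';
import * as s3 from 'aws-cdk-lib/aws-s3';
import * as sfn from 'aws-cdk-lib/aws-stepfunctions';
// Import path assumed; the construct is introduced by this PR.
import { BedrockBatchSfn } from '@cdklabs/generative-ai-cdk-constructs';

const app = new cdk.App();
const stack = new cdk.Stack(app, 'BatchInferenceStack');

// Buckets holding the JSONL input manifests and receiving batch job output.
const inputBucket = new s3.Bucket(stack, 'InputBucket');
const outputBucket = new s3.Bucket(stack, 'OutputBucket');

// Hypothetical instantiation: property names are illustrative, not the construct's actual API.
const batchFragment = new BedrockBatchSfn(stack, 'BedrockBatch', {
  bedrockBatchInputBucket: inputBucket,
  bedrockBatchOutputBucket: outputBucket,
  timeout: cdk.Duration.hours(48), // must fall within the 24h-168h range described above
});

// Embed the fragment in a larger workflow that prepares manifests upstream and
// post-processes results downstream (assumes the fragment is chainable, as
// Step Functions state machine fragments normally are).
new sfn.StateMachine(stack, 'BatchWorkflow', {
  definitionBody: sfn.DefinitionBody.fromChainable(batchFragment),
});
```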
The Code Expert sample performs expert code reviews using an LLM to evaluate code repositories against custom rules defined in natural language that are difficult to evaluate using existing static analysis tools. It can evaluate simple rules that analyze files in isolation, and context rules that consider broader repository content when performing evaluations. This sample leverages Bedrock batch inference for scalability and cost savings. It is an example usage of the [BedrockBatchSfn construct](awslabs/generative-ai-cdk-constructs#928).
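For context, a sketch of how a sample like this might produce the input manifests the construct consumes. The record shape follows the Bedrock batch inference JSONL format (`recordId` plus `modelInput`); the Anthropic Messages body, rule list, and file name are illustrative only:

```ts
import { writeFileSync } from 'node:fs';

interface ManifestRecord {
  recordId: string;
  modelInput: Record<string, unknown>;
}

// Example natural-language rules to evaluate against the repository.
const rules = ['Avoid hard-coded credentials', 'Validate all user input'];

const records: ManifestRecord[] = rules.map((rule, i) => ({
  recordId: `REC${String(i).padStart(8, '0')}`,
  modelInput: {
    // Illustrative Anthropic Messages request body for an InvokeModel-style call.
    anthropic_version: 'bedrock-2023-05-31',
    max_tokens: 1024,
    messages: [{ role: 'user', content: `Review this repository against the rule: ${rule}` }],
  },
}));

// One JSON object per line, as expected by Bedrock batch inference input manifests.
writeFileSync('manifest.jsonl', records.map((r) => JSON.stringify(r)).join('\n'));
```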
Thanks Jeff. LGTM, small suggestions.
LGTM
Thank you Jeff for your contribution.
Nice job! Thank you!
This pull request has been removed from the queue for the following reason: Pull request #928 has been dequeued. The pull request rule doesn't match anymore. You should look at the reason for the failure and decide if the pull request needs to be fixed or if you want to requeue it. If you want to requeue this pull request, you need to post a comment with the text:
This pull request has been removed from the queue for the following reason: Pull request #928 has been dequeued. The pull request could not be merged. This could be related to an activated branch protection or ruleset rule that prevents us from merging (detail: You're not authorized to push to this branch. Visit https://docs.github.com/repositories/configuring-branches-and-merges-in-your-repository/managing-protected-branches/about-protected-branches for more information). You should look at the reason for the failure and decide if the pull request needs to be fixed or if you want to requeue it. If you want to requeue this pull request, you need to post a comment with the text:
This construct was tested in a project that will be submitted to the samples repo next week.
By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of the project license.