Support for Amazon Bedrock #1
To add support for Amazon Bedrock in the `greptileai/zep` repository, follow these steps:
1. **Update Configuration Model**:
- Modify `/config/models.go` to include Amazon Bedrock-specific settings. This involves adding a new struct for the Bedrock configuration, similar to how the `OpenAI` and `Anthropic` configurations are structured (a sketch covering steps 1 and 2 follows this list).
2. **Extend LLM Service Enumeration**:
- In `/config/models.go`, extend the `LLM` struct so that Amazon Bedrock can be selected as an LLM service through the configuration.
3. **Implement Amazon Bedrock Client**:
- Create a new file, perhaps `pkg/llms/bedrock_llm.go`, implementing the `models.ZepLLM` interface for Amazon Bedrock. This includes methods like `Call`, `EmbedTexts`, and `GetTokenCount`, tailored to Amazon Bedrock's API.
4. **Integrate Amazon Bedrock in LLM Client Initialization**:
- In `/pkg/llms/llm_base.go`, extend the `NewLLMClient` function with a case that initializes the Amazon Bedrock client when that service is selected in the configuration (see the sketch after the Step 3 example below).
5. **Update Configuration File**:
- Add Amazon Bedrock-specific settings to `/config.yaml`, such as credentials, endpoints, and any other necessary configuration parameters.
6. **Documentation and Testing**:
- Update the documentation to include instructions on how to configure and use Amazon Bedrock as an LLM service.
- Implement unit tests for the Amazon Bedrock integration to ensure reliability and maintainability.
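As a rough sketch of steps 1 and 2, the additions to `/config/models.go` might look like the following. The struct shape, field names, and `mapstructure` tags are assumptions; mirror whatever the existing `OpenAI` and `Anthropic` configuration sections actually use.
```go
package config

// BedrockConfig holds Amazon Bedrock-specific settings (illustrative only).
type BedrockConfig struct {
	AWSRegion string `mapstructure:"aws_region"`
	Model     string `mapstructure:"model"` // e.g. "anthropic.claude-v2"
	// Credentials are usually resolved via the standard AWS credential chain
	// (environment variables, shared config, IAM role) rather than stored here.
}

// LLM sketches where the new section and service value would plug in.
type LLM struct {
	Service string `mapstructure:"service"` // "openai", "anthropic", or now "bedrock"
	// ...existing OpenAI/Anthropic fields...
	Bedrock BedrockConfig `mapstructure:"bedrock"`
}
```
If the existing structs use a different tag scheme (for example `yaml` tags), the Bedrock fields should follow it; the `config.yaml` entries for step 5 would then simply mirror these tags, e.g. an `llm.bedrock` block with `aws_region` and `model` keys.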
**Example Code Snippet for Step 3 (Implement Amazon Bedrock Client)**:
```go
package llms

import (
	"context"
	"errors"

	"github.com/getzep/zep/config"
	"github.com/getzep/zep/pkg/models"
	"github.com/tmc/langchaingo/llms" // provides the CallOption type used in the signature below (assumed to be langchaingo's)
	// TODO: import the Amazon Bedrock SDK client here
)

// BedrockLLM implements models.ZepLLM backed by Amazon Bedrock.
type BedrockLLM struct {
	// Bedrock client and other necessary fields
}

// NewBedrockLLM initializes and returns a new BedrockLLM client.
func NewBedrockLLM(ctx context.Context, cfg *config.Config) (models.ZepLLM, error) {
	llm := &BedrockLLM{}
	return llm, llm.Init(ctx, cfg)
}

// Call sends the prompt to Amazon Bedrock and returns the completion.
func (b *BedrockLLM) Call(ctx context.Context, prompt string, options ...llms.CallOption) (string, error) {
	return "", errors.New("not implemented") // TODO: invoke the Bedrock model with the provided prompt
}

// EmbedTexts generates embeddings for the given texts using Amazon Bedrock.
func (b *BedrockLLM) EmbedTexts(ctx context.Context, texts []string) ([][]float32, error) {
	return nil, errors.New("not implemented") // TODO: call a Bedrock embedding model
}

// GetTokenCount returns the number of tokens in the given text.
func (b *BedrockLLM) GetTokenCount(text string) (int, error) {
	return 0, errors.New("not implemented") // TODO: count tokens for the configured Bedrock model
}

// Init configures the underlying Bedrock client from the Zep configuration.
func (b *BedrockLLM) Init(ctx context.Context, cfg *config.Config) error {
	return errors.New("not implemented") // TODO: create the Bedrock SDK client from cfg
}
```
Note: Be sure to handle authentication, rate limiting, and error handling as described in Amazon Bedrock's documentation.
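Building on the Step 3 skeleton, below is a minimal, self-contained sketch of how `Init` and `Call` could talk to Bedrock directly through the AWS SDK for Go v2 (`github.com/aws/aws-sdk-go-v2/service/bedrockruntime`), as suggested in the original request. Treat it as a fleshed-out alternative to the skeleton above rather than code to add alongside it: the signatures are simplified (the `config.Config` plumbing and `CallOption` handling are omitted), and the JSON payload shown is the Anthropic text-completion format; other Bedrock model families expect different request and response shapes.
```go
package llms

import (
	"context"
	"encoding/json"
	"fmt"

	"github.com/aws/aws-sdk-go-v2/aws"
	awsconfig "github.com/aws/aws-sdk-go-v2/config"
	"github.com/aws/aws-sdk-go-v2/service/bedrockruntime"
)

// BedrockLLM with the fields the skeleton above leaves open.
type BedrockLLM struct {
	client  *bedrockruntime.Client
	modelID string // e.g. "anthropic.claude-v2"
}

// Init loads AWS credentials and region from the default chain (environment
// variables, shared config, IAM role) and creates the Bedrock runtime client.
func (b *BedrockLLM) Init(ctx context.Context, modelID string) error {
	awsCfg, err := awsconfig.LoadDefaultConfig(ctx)
	if err != nil {
		return fmt.Errorf("loading AWS config: %w", err)
	}
	b.client = bedrockruntime.NewFromConfig(awsCfg)
	b.modelID = modelID
	return nil
}

// Call sends the prompt to the configured model via InvokeModel and returns
// the completion text.
func (b *BedrockLLM) Call(ctx context.Context, prompt string) (string, error) {
	// Anthropic-on-Bedrock text-completion request body (assumed payload shape).
	body, err := json.Marshal(map[string]any{
		"prompt":                fmt.Sprintf("\n\nHuman: %s\n\nAssistant:", prompt),
		"max_tokens_to_sample": 1024,
	})
	if err != nil {
		return "", err
	}
	out, err := b.client.InvokeModel(ctx, &bedrockruntime.InvokeModelInput{
		ModelId:     aws.String(b.modelID),
		ContentType: aws.String("application/json"),
		Accept:      aws.String("application/json"),
		Body:        body,
	})
	if err != nil {
		return "", fmt.Errorf("invoking Bedrock model: %w", err)
	}
	var resp struct {
		Completion string `json:"completion"`
	}
	if err := json.Unmarshal(out.Body, &resp); err != nil {
		return "", err
	}
	return resp.Completion, nil
}
```
Because credentials and region come from the standard AWS credential chain, no API key needs to live in Zep's configuration; the `config.yaml` entries mainly select the service, region, and model ID.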
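For step 4, the wiring in `/pkg/llms/llm_base.go` might look roughly like this. Whether `NewLLMClient` actually switches on `cfg.LLM.Service`, and the constructor names for the existing OpenAI and Anthropic clients, are assumptions; follow the pattern already in that file.
```go
package llms

import (
	"context"
	"fmt"

	"github.com/getzep/zep/config"
	"github.com/getzep/zep/pkg/models"
)

// Hypothetical extension of NewLLMClient with a Bedrock case.
func NewLLMClient(ctx context.Context, cfg *config.Config) (models.ZepLLM, error) {
	switch cfg.LLM.Service {
	case "openai":
		return NewOpenAILLM(ctx, cfg) // existing constructor (name assumed)
	case "anthropic":
		return NewAnthropicLLM(ctx, cfg) // existing constructor (name assumed)
	case "bedrock":
		return NewBedrockLLM(ctx, cfg) // new: from pkg/llms/bedrock_llm.go
	default:
		return nil, fmt.Errorf("unknown LLM service: %s", cfg.LLM.Service)
	}
}
```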
**Is your feature request related to a problem? Please describe.**
No current support for Amazon Bedrock.
**Describe the solution you'd like**
Support added for Amazon Bedrock.
Due to privacy concerns with OpenAI and its sub-products, I'd love to use this with Amazon Bedrock. Direct integration with the Go SDK would be awesome, but either way, I'd love to see this added.