
feat: Add Bedrock Embedding Support (v1.x) #6591

Closed
lambda-science opened this issue Dec 19, 2023 · 3 comments

Comments

@lambda-science (Contributor) commented Dec 19, 2023

Is your feature request related to a problem? Please describe.
Support for AWS Bedrock has already been added for PromptModel nodes; it would be useful to provide it from the EmbeddingRetriever as well, since AWS Bedrock offers not only access to LLMs but also embedding endpoints (Cohere or Amazon Titan), for Haystack v1.x.

Describe the solution you'd like
For example, in the YAML pipeline file:

  - name: Retriever
    type: EmbeddingRetriever
    params:
      document_store: DocumentStore
      embedding_model: text-embedding-ada-002
      top_k: 3
      batch_size: 8
      max_seq_len: 1536
      api_key: ${RETRIEVER_PARAMS_API_KEY}

We could change the embedding_model string to something like aws-titan or aws-cohere and provide the AWS credentials as model_kwargs, just like for the PromptModel.
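
A hypothetical configuration for this proposal might then look like the following (the aws-titan model name and the model_kwargs keys are illustrative assumptions, not an existing Haystack API):

  - name: Retriever
    type: EmbeddingRetriever
    params:
      document_store: DocumentStore
      embedding_model: aws-titan  # or aws-cohere
      top_k: 3
      batch_size: 8
      model_kwargs:
        aws_access_key_id: ${AWS_ACCESS_KEY_ID}
        aws_secret_access_key: ${AWS_SECRET_ACCESS_KEY}
        aws_region_name: ${AWS_REGION_NAME}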

Describe alternatives you've considered
Implementing it myself in a hacky way that relies on LiteLLM to make calls to AWS Bedrock, by creating a custom EmbeddingRetriever component. Here is the idea (a rough sketch follows the list):

  • Create a _BedRockEmbeddingEncoder that inherits from _OpenAIEmbeddingEncoder, add it to the _EMBEDDING_ENCODERS dict, and override the embed method
  • Create a BedrockEmbedding class that inherits from EmbeddingRetriever
  • At initialisation of that class, simply set a modified _EMBEDDING_ENCODERS dict that contains the new AWS Bedrock encoder
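
For illustration, a rough and untested sketch of that idea might look like the code below. The module path, the _EMBEDDING_ENCODERS registry, the embed signature, and the shape of the LiteLLM call are assumptions based on Haystack v1.x internals and LiteLLM's OpenAI-style embedding response, so they may need adjusting:

  import numpy as np
  from litellm import embedding as litellm_embedding

  from haystack.nodes import EmbeddingRetriever
  from haystack.nodes.retriever._embedding_encoder import (
      _EMBEDDING_ENCODERS,
      _OpenAIEmbeddingEncoder,
  )


  class _BedrockEmbeddingEncoder(_OpenAIEmbeddingEncoder):
      """Routes embedding calls to AWS Bedrock through LiteLLM."""

      def embed(self, model, text):
          # LiteLLM speaks the Bedrock API and returns an OpenAI-style response,
          # e.g. for model="bedrock/amazon.titan-embed-text-v1". AWS credentials
          # are assumed to be picked up from the environment.
          response = litellm_embedding(model=model, input=list(text))
          return np.array([record["embedding"] for record in response["data"]])


  class BedrockEmbeddingRetriever(EmbeddingRetriever):
      """EmbeddingRetriever variant that registers the Bedrock encoder."""

      def __init__(self, *args, **kwargs):
          # Register the custom encoder under a dedicated key so the parent
          # class can resolve it (assumed via model_format="bedrock").
          _EMBEDDING_ENCODERS["bedrock"] = _BedrockEmbeddingEncoder
          super().__init__(*args, **kwargs)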

Additional context
I wanted to use LiteLLM because it provides a unified API to almost ALL LLM providers. I think it's a great backend.

@anakin87 (Member)

Hello, @lambda-science...
This was done in #6406.
If you want to use this feature before the next 1.x release, you should install from the 1.x branch.

@lambda-science (Contributor, Author) commented Dec 19, 2023

Hello, @lambda-science... This was done in #6406. If you want to use this feature before the next 1.x release, you should install from the 1.x branch.

Ahah, well, Sara said on Discord that a colleague was working on support for Bedrock in Haystack 2.0 and that I should check out his PR. While I was browsing through Bedrock-related discussions, I just stumbled one minute ago on this specific PR. Sorry for opening an issue on something that has already been implemented :) I will install the latest and test it <3

@anakin87 (Member)

Great!
