Is your feature request related to a problem? Please describe.
Support for AWS Bedrock has been provided for PromptModel nodes; it could be useful to provide it for the EmbeddingRetriever as well, since AWS Bedrock provides not only access to LLMs but also embedding endpoints (Cohere or AWS Titan), for Haystack v1.x.
Describe the solution you'd like
For example, in the YAML pipeline file: we could change the embedding_model string to something like aws-titan or aws-cohere and provide AWS credentials as model_kwargs, just like for the PromptModel.
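A hypothetical snippet of what such a pipeline YAML could look like. The aws-titan model name, the credential keys, and the component layout here are illustrative assumptions of the proposal, not an existing Haystack API:

```yaml
components:
  - name: Retriever
    type: EmbeddingRetriever
    params:
      document_store: DocumentStore
      embedding_model: aws-titan   # proposed value, does not exist yet
      model_kwargs:                # proposed way to pass AWS credentials,
        aws_access_key_id: YOUR_KEY_ID        # mirroring PromptModel
        aws_secret_access_key: YOUR_SECRET
        aws_region_name: us-east-1
```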
Describe alternatives you've considered
Implementing it myself in a hacky way that relies on LiteLLM to make calls to AWS Bedrock, by creating a custom EmbeddingRetriever component. Here is the idea:

- Create a _BedRockEmbeddingEncoder that inherits from _OpenAIEmbeddingEncoder, overriding the embed method, and add it to the _EMBEDDING_ENCODERS dict
- Create a BedrockEmbedding class that inherits from EmbeddingRetriever
- At initialization of that class, simply set a modified _EMBEDDING_ENCODERS dict that contains the new AWS Bedrock encoders
Additional context
I wanted to use LiteLLM because it provides a unified API to almost all LLM providers. I think it's a great backend.
Hello, @lambda-science... This was done in #6406. If you want to use this feature before the next 1.x release, you should install from the 1.x branch.
Ahah, well, Sara said on Discord that a colleague was working on support for Bedrock in Haystack 2.0 and that I should check out his PR. While I was browsing through Bedrock-related discussions, I just stumbled one minute ago on this specific PR. Sorry for opening an issue on something that has already been implemented :) I will install the latest and test it <3