Added **kwargs for embedding funcs #7664
Conversation
The change under discussion (from the diff view):

```diff
@@ -461,7 +463,7 @@ async def _aembedding_func(self, text: str, *, engine: str) -> List[float]:
         )["data"][0]["embedding"]

     def embed_documents(
-        self, texts: List[str], chunk_size: Optional[int] = 0
+        self, texts: List[str], chunk_size: Optional[int] = 0, **kwargs
```
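For context, the intent of the change is that any extra keyword arguments passed to `embed_documents` (and `embed_query`) get forwarded into the parameters sent with the embeddings request. Below is a minimal, self-contained sketch of that idea; the class, the `_embedding_func` helper, and the way the request body is assembled are illustrative assumptions, not the actual LangChain implementation.

```python
from typing import Any, Dict, List, Optional


class EmbeddingsWithKwargsSketch:
    """Illustrative stand-in for OpenAIEmbeddings, not the real class."""

    def __init__(self, deployment: str = "text-embedding-ada-002") -> None:
        self.deployment = deployment

    def _embedding_func(self, text: str, *, engine: str, **kwargs: Any) -> List[float]:
        # Stand-in for the call to the OpenAI embeddings endpoint. The point
        # is that the extra kwargs (e.g. "user", "model") end up in the
        # request body, where a custom proxy in front of OpenAI can read them.
        request_body: Dict[str, Any] = {"input": text, "engine": engine, **kwargs}
        print("would send:", request_body)
        return [0.0]  # placeholder vector

    def embed_documents(
        self, texts: List[str], chunk_size: Optional[int] = 0, **kwargs: Any
    ) -> List[List[float]]:
        # chunk_size is ignored in this sketch; the real class uses it for batching.
        # Forward any extra keyword arguments down to each request.
        return [
            self._embedding_func(text, engine=self.deployment, **kwargs)
            for text in texts
        ]


# Example: "user" and "model" are hypothetical fields a proxy might require.
EmbeddingsWithKwargsSketch().embed_documents(["hello"], user="JL", model="ada")
```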
This signature is defined on the base embeddings class, so we probably don't want to break the interface just here, and I'm not sure we should add it to the base class.
What if we added something like this instead? https://github.com/hwchase17/langchain/blob/c7b687e944883df972cabdf00064112587306daf/langchain/llms/openai.py#L137
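The linked line points at the `model_kwargs` handling in the OpenAI LLM wrapper. Roughly, the idea is a pre-validation hook that sweeps any undeclared keyword arguments into a `model_kwargs` dict, which is later merged into the request parameters. Here is a simplified sketch under pydantic v1 (which LangChain used at the time); the class name and fields are placeholders, not the code at that exact line.

```python
from typing import Any, Dict

from pydantic import BaseModel, Field, root_validator


class OpenAIWrapperSketch(BaseModel):
    """Simplified sketch of the model_kwargs pattern, not the real class."""

    model_name: str = "text-davinci-003"
    model_kwargs: Dict[str, Any] = Field(default_factory=dict)

    @root_validator(pre=True)
    def build_extra(cls, values: Dict[str, Any]) -> Dict[str, Any]:
        # Collect any constructor arguments that are not declared fields
        # into model_kwargs, so they can be merged into the request later.
        declared = {field.alias for field in cls.__fields__.values()}
        extra = dict(values.get("model_kwargs", {}))
        for name in list(values):
            if name not in declared:
                extra[name] = values.pop(name)
        values["model_kwargs"] = extra
        return values


print(OpenAIWrapperSketch(user="JL").model_kwargs)  # {'user': 'JL'}
```

In the actual library, this validation also rejects declared parameters passed via `model_kwargs`, which is what produces the error quoted in the next comment.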
The issue is that I also need to pass a `model` parameter. If I place it in `model_kwargs` (as I do in other places where I can):

```python
llm = AzureOpenAI(
    deployment_name="text-davinci-003",
    model_name="text-davinci-003",
    model_kwargs={"user": "JL", "model": "text-davinci-003"},
)
```

I get a validation error:

```
__root__
Parameters {'model'} should be specified explicitly. Instead they were passed in as part of `model_kwargs` parameter. (type=value_error)
```
For embeddings you can already pass `model` in directly, no? `OpenAIEmbeddings(model=...)`
Yes, but I also need to place it in the request body, because that is what our custom proxy parses 😢
@FrancescoSaverioZuppichini Hi, could you please resolve the merge conflicts and address the last comments (if needed)? After that, ping me and I will push this PR for review. Thanks!
Sorry, no time.
@baskaryan I'm closing it.
Hi there,
This is a simple PR adding `**kwargs` to `.embed_query`, similar to the other OpenAI functions. This is handy because it lets me add whatever I want to the request: in my case I am running OpenAI behind a custom proxy and I need a couple more fields in the requests. @baskaryan
Thanks!
Cheers,
Fra
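For readers skimming this thread, the kind of call the patch was meant to enable looks roughly like the following. It is illustrative only: the PR was closed, so stock `OpenAIEmbeddings.embed_query` does not accept these extra keyword arguments, and the `user`/`model` fields are hypothetical examples of data a custom proxy might want to see in the request body.

```python
from langchain.embeddings import OpenAIEmbeddings

embeddings = OpenAIEmbeddings(model="text-embedding-ada-002")

# Only possible with this patch applied: extra fields ride along in the
# request body so a proxy sitting in front of OpenAI can inspect them.
vector = embeddings.embed_query(
    "What is the capital of France?",
    user="JL",                       # hypothetical proxy field
    model="text-embedding-ada-002",  # duplicated into the body for the proxy
)
```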