
Added **kwargs for embedding funcs #7664

Conversation

FrancescoSaverioZuppichini

Hi there,

This is a simple PR adding `**kwargs` to `.embed_query`, similar to the other OpenAI functions. This is handy because it lets me add whatever I want to the request: in my case I am running OpenAI behind a custom proxy and I need to attach a couple more things to the requests.

@baskaryan

Thanks

Cheers,

Fra

```diff
@@ -461,7 +463,7 @@ async def _aembedding_func(self, text: str, *, engine: str) -> List[float]:
         )["data"][0]["embedding"]

     def embed_documents(
-        self, texts: List[str], chunk_size: Optional[int] = 0
+        self, texts: List[str], chunk_size: Optional[int] = 0, **kwargs
```
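The intent of the change can be sketched in isolation like this (a minimal standalone sketch, not LangChain's actual code; `fake_embedding_create` is a hypothetical stand-in for the real OpenAI embeddings call):

```python
from typing import Any, Dict, List


def fake_embedding_create(**request: Any) -> Dict[str, Any]:
    """Stand-in for the OpenAI embeddings call; echoes the request body
    back so we can see which extra fields a proxy would receive."""
    return {"data": [{"embedding": [0.0, 0.1]}], "request": request}


class EmbeddingsSketch:
    def embed_query(self, text: str, **kwargs: Any) -> List[float]:
        # Extra keyword arguments are forwarded verbatim into the
        # request body (e.g. fields a custom proxy needs to parse).
        response = fake_embedding_create(input=text, **kwargs)
        return response["data"][0]["embedding"]


vector = EmbeddingsSketch().embed_query("hello", user="JL", model="text-davinci-003")
```

With this shape, anything a deployment-specific proxy needs can ride along on the call without the class having to know about it in advance.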
Collaborator
This signature is defined on the base embeddings class; we probably don't want to break the interface just here, and I'm not sure we should add it to the base class.

What if we added something like this instead? https://github.com/hwchase17/langchain/blob/c7b687e944883df972cabdf00064112587306daf/langchain/llms/openai.py#L137
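The pattern at that link is (roughly) a constructor-time sweep: any keyword argument that is not a declared field is collected into `model_kwargs` and later merged into every request body, so no per-method signature has to change. A plain-Python sketch of the idea (the real code does this with a pydantic `root_validator` named `build_extra`; class and method names here are illustrative):

```python
from typing import Any, Dict, List


class OpenAIEmbeddingsSketch:
    def __init__(self, model: str = "text-embedding-ada-002",
                 chunk_size: int = 1000, **extra: Any) -> None:
        self.model = model
        self.chunk_size = chunk_size
        # Mirrors build_extra: unknown kwargs are swept into model_kwargs
        # instead of being rejected or widening method signatures.
        self.model_kwargs: Dict[str, Any] = dict(extra)

    def request_body(self, texts: List[str]) -> Dict[str, Any]:
        # model_kwargs is merged into the body of every request.
        return {"input": texts, "model": self.model, **self.model_kwargs}


emb = OpenAIEmbeddingsSketch(user="JL")
body = emb.request_body(["hello"])
```

The trade-off being discussed: this keeps extras at construction time (one set of extras per instance), whereas `**kwargs` on `embed_query` allows per-call extras.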

Author

The issue is that I also need to pass a `model` parameter. If I put it in `model_kwargs` (in the other places where I can):

```python
llm = AzureOpenAI(
    deployment_name="text-davinci-003",
    model_name="text-davinci-003",
    model_kwargs={"user": "JL", "model": "text-davinci-003"},
)
```

```
__root__
  Parameters {'model'} should be specified explicitly. Instead they were passed in as part of `model_kwargs` parameter. (type=value_error)
```
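The error quoted above comes from a validation guard that refuses declared fields inside `model_kwargs`; its logic is roughly the following (a simplified sketch, not LangChain's actual implementation; `check_model_kwargs` is a hypothetical name):

```python
from typing import Any, Dict, Set


def check_model_kwargs(model_kwargs: Dict[str, Any],
                       declared_fields: Set[str]) -> None:
    """Reject keys in model_kwargs that the class already declares as
    explicit fields (e.g. `model`), mirroring the error shown above."""
    invalid = set(model_kwargs) & declared_fields
    if invalid:
        raise ValueError(
            f"Parameters {invalid} should be specified explicitly. "
            "Instead they were passed in as part of `model_kwargs` parameter."
        )


# Passing `model` through model_kwargs trips the guard:
try:
    check_model_kwargs({"user": "JL", "model": "text-davinci-003"}, {"model"})
    raised = False
except ValueError:
    raised = True
```

So duplicating `model` into `model_kwargs` to get it onto the request body is blocked by design, which is why the author wants a different escape hatch.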

Collaborator

For embeddings you can already pass `model` in directly, no?

`OpenAIEmbeddings(model=...)`

Author

Yes, but I also need to place it in the request body, because that is what our custom proxy parses 😢

@leo-gan
Copy link
Collaborator

leo-gan commented Sep 19, 2023

@FrancescoSaverioZuppichini Hi, could you please resolve the merge conflicts and address the last comments (if needed)? After that, ping me and I'll push this PR for review. Thanks!

@FrancescoSaverioZuppichini
Copy link
Author

> @FrancescoSaverioZuppichini Hi, could you please resolve the merge conflicts and address the last comments (if needed)? After that, ping me and I'll push this PR for review. Thanks!

Sorry, no time.

@leo-gan
Copy link
Collaborator

leo-gan commented Oct 2, 2023

@baskaryan I'm closing it.

@leo-gan leo-gan closed this Oct 2, 2023
Labels
🤖:improvement Medium size change to existing code to handle new use-cases
3 participants