ConversationalRetrievalChain doesn't work along with memory and return_source_documents #9394
🤖 Hello, thank you for bringing this issue to our attention. It seems like you're encountering a problem when using `ConversationalRetrievalChain` together with a memory and `return_source_documents=True`. The error message you're seeing is due to the fact that the memory cannot tell which of the chain's multiple outputs (`answer` and `source_documents`) it should store.

As a workaround, you've found that you can avoid the error by providing an external chat history when querying the chain. This is because when a chat history is provided, the chain does not need to read from or write to the attached memory. However, this is not an ideal solution, as it requires you to manage the chat history externally. We understand that this might not be the most convenient way to use the chain.

We have previously encountered a similar issue, as reported in "Is there no chain for question answer with sources and memory?". The solution suggested there was to set the memory's `output_key` so that it knows which output to save. We acknowledge that this is a limitation of the current design of `ConversationalRetrievalChain`.

Thank you for your understanding and patience. If you have any other questions or issues, please feel free to ask.
Same issue, please help.
Nvm, I found the fix: you need to define the memory with `output_key='answer'` so it knows which output to store.
Great, now it's working. Thanks!
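To see why `output_key` matters, here is a toy reproduction of the mechanism (a sketch only, not LangChain's actual code): the memory must pick exactly one output to store, and with `return_source_documents=True` the chain returns two keys, so saving fails unless `output_key` disambiguates.

```python
def select_output(outputs, output_key=None):
    """Pick the single value a memory should store from a chain's outputs.

    Mimics the behavior described in this thread: with no output_key and
    more than one output, there is no way to choose, so we raise.
    """
    if output_key is not None:
        return outputs[output_key]
    if len(outputs) != 1:
        raise ValueError(f"One output key expected, got {list(outputs)}")
    return next(iter(outputs.values()))


outputs = {"answer": "42", "source_documents": ["doc-1"]}

# Without output_key, two outputs -> ValueError (the error from this issue).
# With output_key="answer", the memory stores just the answer string.
print(select_output(outputs, output_key="answer"))
```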
I tried this with `output_key='answer'` but get the same error:

```
ValueError: One output key expected, got dict_keys(['answer', 'source_documents'])
```

When I remove the `return_source_documents=True` setting, the chain works but does not refer to the chat history when answering the follow-up question. Any insights or an ETA for when this can be resolved?
Hi, @fcalabrow, I'm helping the LangChain team manage their backlog and am marking this issue as stale. From what I understand, the issue was initially resolved by defining the memory with `output_key='answer'`. Could you please confirm whether this issue is still relevant to the latest version of the LangChain repository? If it is, please let the LangChain team know by commenting on the issue. Otherwise, feel free to close the issue yourself, or it will be automatically closed in 7 days. Thank you for your understanding and cooperation.
Issue you'd like to raise.

I'm trying to use a `ConversationalRetrievalChain` along with a `ConversationBufferMemory` and `return_source_documents` set to `True`. The problem is that, under this setting, I get an error when I call the overall chain. The error message says:

```
ValueError: One output key expected, got dict_keys(['answer', 'source_documents'])
```

It works if I remove either the memory or the `return_source_documents` parameter. So far, the only workaround that I found is querying the chain using an external chat history, like this:

```python
chain({"question": query, "chat_history": "dummy chat history"})
```

Thank you in advance for your help.
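The workaround above amounts to managing the chat history yourself instead of letting an attached memory object store outputs. A minimal sketch of that pattern (toy code with a stand-in chain, not LangChain's API):

```python
def toy_chain(inputs):
    """Stand-in for the chain call: like return_source_documents=True,
    it returns two output keys instead of one."""
    history = inputs.get("chat_history", [])
    return {
        "answer": f"echo: {inputs['question']} (saw {len(history)} prior turns)",
        "source_documents": ["doc-1"],
    }


# The caller owns the history, so nothing ever has to guess which of the
# two output keys to store.
chat_history = []
result = toy_chain({"question": "what is X?", "chat_history": chat_history})
chat_history.append(("what is X?", result["answer"]))
```

The trade-off, as noted earlier in the thread, is that you now have to thread `chat_history` through every call yourself.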