ChatOllama stream method raises warn_deprecated NotImplementedError #14980
Comments
🤖 It's great to see that you've identified a potential issue and a possible solution related to the `ChatOllama` `stream()` method. If you're interested in contributing to LangChain, I would encourage you to go ahead and create a pull request with your proposed fix. The maintainers will review your changes and determine whether they align with the project's goals. Thank you for your interest in improving LangChain!

This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.

Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant. If you want to continue the conversation, start your reply with @dosu-bot.
I'm getting the same issue in my app. Chainlit offers a generic LangChain provider that streams the model backend request provided by an llm input. When I add support for updating slider inputs in a playground request like so:

```python
# Inside Chainlit's LangChain provider; make_async wraps the sync
# stream() call so it can be awaited.
result = await make_async(self.llm.stream)(
    input=messages,              # user prompt
    **request.prompt.settings,   # user inputs like temperature, top_k, etc.
)
```

I get the error above. When I downgrade to v0.0.350 I also receive the same error, so I am unsure if the issue is in the specific PR you mentioned. The out-of-the-box

Edit: On follow-up testing, it appears that v0.0.349 is the release that is working for me.

Attachment: langchain_ollama_stream_error.mp4
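For anyone else hitting this in the meantime, pinning to the last release that worked for me (e.g. `pip install langchain==0.0.349`, assuming pip) avoids the error.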
The issue still persists in the latest version. Please provide guidance on how to handle streaming in `ChatOllama`. @v-byte-cpu, do you have any updates?
Issue was fixed in #15104 |
I'm facing this issue with GooglePalm in the latest version.
System Info
langchain version: v0.0.352
python version: 3.11
Hi there! After PR #14713 was merged, I started getting `NotImplementedError` errors from the `stream()` method.
I guess this decorator must have a `pending=True` argument (a sketch of what I assume is going on follows the maintainer tags below).

Who can help?
@hwchase17 @agola11
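Here's a minimal sketch of the failure mode as I understand it (a paraphrase of the deprecation helper's control flow, not the actual `langchain_core` source):

```python
import warnings


# Paraphrased sketch, not the real library code: how a deprecation
# helper like warn_deprecated can end up raising instead of warning.
def warn_deprecated_sketch(
    since: str,
    *,
    removal: str = "",
    pending: bool = False,
) -> None:
    if not pending and not removal:
        # With neither pending=True nor an explicit removal version,
        # there is no deprecation schedule to announce, so the helper
        # raises. This matches the NotImplementedError in the title.
        raise NotImplementedError(
            "Need to determine which default deprecation schedule to use."
        )
    # With pending=True the call degrades to an ordinary warning,
    # which is why I think the decorator needs that argument.
    warnings.warn(f"Deprecated since {since}.", DeprecationWarning, stacklevel=2)
```

So passing `pending=True` at the call site added in #14713 should take the warning branch instead of raising.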
Information
Related Components
Reproduction
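A minimal sketch of how I trigger it (the model name is a placeholder; assumes a local Ollama server is running):

```python
from langchain.chat_models import ChatOllama

llm = ChatOllama(model="llama2")  # placeholder model name

# Iterating the stream raises NotImplementedError from the
# warn_deprecated call instead of yielding message chunks.
for chunk in llm.stream("Hello!"):
    print(chunk.content, end="", flush=True)
```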
Expected behavior
Successful streaming output from the LLM.
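For the async path (the one the Chainlit comment above goes through), the expected behavior would look like this sketch, with the same placeholder model:

```python
import asyncio

from langchain.chat_models import ChatOllama


async def main() -> None:
    llm = ChatOllama(model="llama2")  # placeholder model name
    # Expected: chunks arrive incrementally, with no deprecation error.
    async for chunk in llm.astream("Tell me a joke"):
        print(chunk.content, end="", flush=True)


asyncio.run(main())
```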