Not using Tools in Langchain_Ollama causes: 'NoneType' object is not iterable #28312
Comments
We had another issue that was similar to this one, maybe try
Yes, that is how I would also fix the bug in the LangChain implementation.
Did it work?
@bauerem please upgrade to the latest version of
Same error here:
Package Information
Code example
Error
@keenborder786 The latest version doesn't seem to have the None check.
@nourishnew Yep, you are correct. I have fixed it in the PR; hopefully it will be fixed in the next release.
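For context, the None check discussed above could look roughly like the sketch below. This is not the exact code from the fix PR; it assumes the Ollama response supports mapping-style access (as the traceback further down suggests) and that each raw tool call carries a function entry with name and arguments.

from typing import Any, List, Mapping
from uuid import uuid4

from langchain_core.messages.tool import ToolCall, tool_call


def _get_tool_calls_from_response(response: Mapping[str, Any]) -> List[ToolCall]:
    """Extract tool calls from an Ollama chat response, tolerating a
    missing or None 'tool_calls' field."""
    tool_calls: List[ToolCall] = []
    message = response.get("message") or {}
    # The 'or []' is the guard: when no tools are bound, 'tool_calls' can be
    # present but set to None, which is where the reported TypeError comes from.
    for tc in message.get("tool_calls") or []:
        tool_calls.append(
            tool_call(
                id=str(uuid4()),
                name=tc["function"]["name"],
                args=tc["function"]["arguments"],
            )
        )
    return tool_calls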
Checked other resources
Example Code
from langchain_ollama import ChatOllama

llm = ChatOllama(
    model="llama3.1",
    temperature=0,
    disable_streaming=True
)

messages = [
    (
        "system",
        "You are a helpful assistant that translates English to French. Translate the user sentence.",
    ),
    ("human", "I love programming."),
]

ai_msg = llm.invoke(messages)
print(ai_msg)
Error Message and Stack Trace (if applicable)
Traceback (most recent call last):
  File "/home/user1/Repo/ollama-cloud/use4.py", line 16, in <module>
    ai_msg = llm.invoke(messages)
  File "/home/user1/Repo/ollama-cloud/venv/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 286, in invoke
    self.generate_prompt(
  File "/home/user1/Repo/ollama-cloud/venv/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 786, in generate_prompt
    return self.generate(prompt_messages, stop=stop, callbacks=callbacks, **kwargs)
  File "/home/user1/Repo/ollama-cloud/venv/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 643, in generate
    raise e
  File "/home/user1/Repo/ollama-cloud/venv/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 633, in generate
    self._generate_with_cache(
  File "/home/user1/Repo/ollama-cloud/venv/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 851, in _generate_with_cache
    result = self._generate(
  File "/home/user1/Repo/ollama-cloud/venv/lib/python3.10/site-packages/langchain_ollama/chat_models.py", line 648, in _generate
    final_chunk = self._chat_stream_with_aggregation(
  File "/home/user1/Repo/ollama-cloud/venv/lib/python3.10/site-packages/langchain_ollama/chat_models.py", line 560, in _chat_stream_with_aggregation
    tool_calls=_get_tool_calls_from_response(stream_resp),
  File "/home/user1/Repo/ollama-cloud/venv/lib/python3.10/site-packages/langchain_ollama/chat_models.py", line 71, in _get_tool_calls_from_response
    for tc in response["message"]["tool_calls"]:
TypeError: 'NoneType' object is not iterable
Description
Trying to simply get a chat response from ChatOllama.
System Info
System Information
Package Information
Optional packages not installed
Other Dependencies
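Until a release containing that fix is available, one possible stop-gap is to wrap the private helper named in the traceback so that a None tool_calls value yields an empty list instead of raising. This is an untested sketch that monkey-patches a private API which may change between releases; upgrading langchain-ollama once the fix ships is the proper solution.

import langchain_ollama.chat_models as _ollama_chat_models

# Keep a reference to the original private helper named in the traceback.
_original_get_tool_calls = _ollama_chat_models._get_tool_calls_from_response


def _safe_get_tool_calls(response):
    # Report no tool calls when 'tool_calls' is absent or None, instead of
    # letting the original helper iterate over None.
    try:
        if not response["message"]["tool_calls"]:
            return []
    except (KeyError, TypeError):
        return []
    return _original_get_tool_calls(response)


_ollama_chat_models._get_tool_calls_from_response = _safe_get_tool_calls

Because the helper is looked up by name at call time, applying this patch before calling llm.invoke(messages) should be enough for it to take effect.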