SQLDatabaseToolkit adds temperature=0 by default and breaks the execution when using o3-mini #29541
Comments
Since OpenAI reasoning models do not support the temperature parameter, I have updated the ChatOpenAI API to exclude temperature from the default parameters to avoid this error.
Thank you for looking into this.
@aifa I have gone through the create_react_agent API and I'll let you know the reason soon. If you are still facing the issue, could you please provide a code snippet showing how you call create_react_agent, along with its response?
Hi @rawathemant246, it seems create_react_agent did not work because I was not using the latest version of langchain_openai. After updating the package, the error stopped occurring.
@aifa I also checked how to use create_react_agent; you can use the latest method now.
Checked other resources
Example Code
In langchain_community/agent_toolkits/sql/toolkit.py, in class SQLDatabaseToolkit(BaseToolkit), at line 48:
Instantiate:
.. code-block:: python
This instructs the LLM to generate code using temperature=0. When o3-mini is used, this causes:
BadRequestError('Error code: 400 - {'error': {'message': "Unsupported parameter: 'temperature' is not supported with this model.",
I removed temperature=0 from my local copy of the code and instructed it not to use temperature with o3-mini, and the error stopped.
It also works when I pass the instruction not to set temperature directly in the user/system prompt.
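The workaround described above can be sketched as a small helper that strips temperature from the kwargs before they reach the OpenAI client when a reasoning model is targeted. This is a minimal illustration under stated assumptions, not langchain code: the name build_model_kwargs and the prefix list are hypothetical.

.. code-block:: python

    # Hypothetical workaround sketch: OpenAI reasoning models (o1/o3
    # family) reject `temperature` with a 400 `unsupported_parameter`
    # error, so drop it from the kwargs before building the client call.
    REASONING_MODEL_PREFIXES = ("o1", "o3")  # assumption, not exhaustive

    def build_model_kwargs(model: str, **kwargs) -> dict:
        """Return kwargs that are safe to pass for the given model."""
        if model.startswith(REASONING_MODEL_PREFIXES):
            kwargs.pop("temperature", None)  # reasoning models reject it
        return {"model": model, **kwargs}

    print(build_model_kwargs("o3-mini", temperature=0))
    # {'model': 'o3-mini'}
    print(build_model_kwargs("gpt-4o", temperature=0))
    # {'model': 'gpt-4o', 'temperature': 0}

A fix along these lines in SQLDatabaseToolkit (or in ChatOpenAI's default-parameter handling, as mentioned in the comments above) would avoid the BadRequestError without requiring users to patch their local copy.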
Error Message and Stack Trace (if applicable)
BadRequestError('Error code: 400 - {'error': {'message': "Unsupported parameter: 'temperature' is not supported with this model.", 'type': 'invalid_request_error', 'param': 'temperature', 'code': 'unsupported_parameter'}}')
Traceback (most recent call last):
File "/root/.cache/pypoetry/virtualenvs/gix-service-9TtSrW0h-py3.12/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 637, in generate
self._generate_with_cache(
File "/root/.cache/pypoetry/virtualenvs/gix-service-9TtSrW0h-py3.12/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 855, in _generate_with_cache
result = self._generate(
^^^^^^^^^^^^^^^
File "/root/.cache/pypoetry/virtualenvs/gix-service-9TtSrW0h-py3.12/lib/python3.12/site-packages/langchain_openai/chat_models/base.py", line 717, in _generate
response = self.client.create(**payload)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/.cache/pypoetry/virtualenvs/gix-service-9TtSrW0h-py3.12/lib/python3.12/site-packages/openai/_utils/_utils.py", line 279, in wrapper
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/root/.cache/pypoetry/virtualenvs/gix-service-9TtSrW0h-py3.12/lib/python3.12/site-packages/openai/resources/chat/completions.py", line 850, in create
return self._post(
^^^^^^^^^^^
File "/root/.cache/pypoetry/virtualenvs/gix-service-9TtSrW0h-py3.12/lib/python3.12/site-packages/openai/_base_client.py", line 1283, in post
return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/.cache/pypoetry/virtualenvs/gix-service-9TtSrW0h-py3.12/lib/python3.12/site-packages/openai/_base_client.py", line 960, in request
return self._request(
^^^^^^^^^^^^^^
File "/root/.cache/pypoetry/virtualenvs/gix-service-9TtSrW0h-py3.12/lib/python3.12/site-packages/openai/_base_client.py", line 1064, in _request
raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'error': {'message': "Unsupported parameter: 'temperature' is not supported with this model.", 'type': 'invalid_request_error', 'param': 'temperature', 'code': 'unsupported_parameter'}}
System Info
System Information
Package Information
Other Dependencies