Set min/max values for float options for OpenAI models #115
I tried this in a branch: `llm/llm/default_plugins/openai_models.py`, line 91 in 6bad6de.

It has an unfortunate impact on the output of lines 102 to 118 in 6bad6de. Note the ugly annotation it produces.

I'm not sure how best to clean that up, especially since that output differs between Python versions. In fact, https://github.com/simonw/llm/actions/runs/5583462965/jobs/10203805843 failed on Python 3.7 with this differing cog output:

```
temperature: typing_extensions.Annotated[float, None, Interval(gt=None, ge=0, lt=None, le=2), None, None]
```
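The constraint experiment above can be sketched with Pydantic's `Field` (a sketch, not the branch's actual code; the `Options` model name is hypothetical, and the 0–2 bounds follow OpenAI's documented temperature range):

```python
from pydantic import BaseModel, Field, ValidationError


class Options(BaseModel):
    # Hypothetical option model; bounds follow the OpenAI docs
    # (0 <= temperature <= 2).
    temperature: float = Field(default=1.0, ge=0, le=2)


print(Options(temperature=1.5).temperature)  # accepted

try:
    Options(temperature=-1)  # rejected locally, before any API call
except ValidationError:
    print("temperature out of range")
```

The trade-off discussed here is that the `ge`/`le` metadata leaks into the `Annotated[...]` repr that cog captures for the docs, and that repr is not stable across Python versions.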
In the meantime it works OK without those min/max settings: the prompt ends up being sent to the OpenAI API, which returns a useful error message.

```
llm -m chatgpt 'say hello twice' -o temperature -1
```