Add support for OpenAI's GPT-4 turbo (gpt-4-1106-preview) model + GPT-4o + GPT-4o mini #4
Comments
A quick search of the repo suggests changes would need to be made to at least the following files:
Yes, great point! I think it would make sense to even parametrize the model as a command-line argument 🤔
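A command-line flag for the model could be sketched with plain `argv` parsing. This is a minimal sketch only; the flag name `--model`, the helper name, and the `gpt-4o-mini` default are assumptions, and the real CLI may well use a framework instead:

```typescript
// Hedged sketch: parse a hypothetical --model flag from process.argv.
// The flag name and the gpt-4o-mini fallback are illustrative assumptions,
// not the project's actual interface.
export function parseModelFlag(
  argv: string[],
  fallback = "gpt-4o-mini"
): string {
  // Support both "--model gpt-4o" and "--model=gpt-4o" forms.
  const i = argv.indexOf("--model");
  if (i !== -1 && i + 1 < argv.length) return argv[i + 1];
  const prefixed = argv.find((a) => a.startsWith("--model="));
  return prefixed ? prefixed.slice("--model=".length) : fallback;
}
```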
Yeah, I was thinking that too. Though maybe it can still have some 'friendly aliases' built in or similar, so that the end user doesn't need to know the exact model name (specifically thinking about the current preview version). There are APIs for querying the available models too, so if you wanted to get really fancy and not hardcode things, the CLI could potentially fetch the available models from that URL and cache them, then tell the user which ones could be used. Though that might be overkill.
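The 'friendly aliases' idea could be a small lookup table that maps stable, user-facing names onto exact model IDs. A minimal sketch, assuming alias spellings that are purely illustrative (only the model IDs come from this thread):

```typescript
// Hedged sketch: map friendly aliases to exact OpenAI model IDs so users
// don't need to remember preview-version names. The alias spellings here
// are illustrative assumptions.
const MODEL_ALIASES: Record<string, string> = {
  "gpt-4-turbo": "gpt-4-1106-preview", // preview ID for GPT-4 Turbo
  "4o": "gpt-4o",
  "4o-mini": "gpt-4o-mini",
};

export function resolveModel(name: string): string {
  // Unknown names pass through unchanged, so exact model IDs still work.
  return MODEL_ALIASES[name] ?? name;
}
```

Fetching and caching the live model list from the models API could then layer on top of this, with the table acting as an offline fallback.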
GPT-4o mini was announced today; ~60% cheaper than GPT-3.5 Turbo:
The v2 CLI's OpenAI feature seems to allow the model name to be specified via `humanify/src/commands/openai.ts` (lines 9 to 12 in a6b0999).

See also:
Since OpenAI released the GPT-4 Turbo model (`gpt-4-1106-preview`) and reduced the prices of GPT-4 at their recent dev day, it would be cool if this tool was able to support using those as well.

Further Reading
See Also