Mechanism for continuing an existing conversation #6
But how can it keep track of a specific conversation? There could be a command to start a new chat; once started, future chat commands remain within that chat until you end it or switch to another one. Chats will make a lot more sense in the web interface. Maybe there's a mode where a chat session is an interactive terminal process, where you type in more text and hit enter to send a line at a time.
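The interactive mode suggested above could be sketched as a simple read-send loop. This is a minimal illustration, not the tool's implementation; the `send` callback stands in for whatever function dispatches a prompt to the model within the current chat.

```python
def run_chat(lines, send):
    """Feed each input line to send() as a prompt in the current chat.

    lines: an iterable of user input (e.g. sys.stdin in a real REPL).
    send: callback that takes a prompt string and returns a response.
    Stops on an empty line or an explicit 'exit'.
    """
    responses = []
    for line in lines:
        line = line.strip()
        if not line or line == "exit":
            break
        responses.append(send(line))
    return responses
```

In a real terminal session `lines` would be `sys.stdin` and `send` would call the model API, keeping the accumulated chat context between turns.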
The logging database schema should take chats into account as well.
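One way the log schema could account for chats is a self-referencing `chat_id` column, where the first message in a chat has `chat_id` NULL and follow-ups point back at it. This is a hypothetical sketch; apart from the `chat_id` column mentioned in the commit below, the table and column names are illustrative, not `llm`'s actual schema.

```python
import sqlite3

# Throwaway in-memory database standing in for log.db.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE log (
        id INTEGER PRIMARY KEY,
        chat_id INTEGER REFERENCES log(id),  -- NULL for the first message in a chat
        prompt TEXT,
        response TEXT,
        timestamp TEXT DEFAULT CURRENT_TIMESTAMP
    )
""")
# First exchange starts a new chat (chat_id left NULL).
conn.execute("INSERT INTO log (prompt, response) VALUES (?, ?)", ("Hello", "Hi!"))
first_id = conn.execute("SELECT last_insert_rowid()").fetchone()[0]
# A follow-up exchange references the opening row's id.
conn.execute(
    "INSERT INTO log (chat_id, prompt, response) VALUES (?, ?, ?)",
    (first_id, "Follow-up question", "Sure."),
)
```

Grouping a full conversation is then a single `WHERE chat_id = ? OR id = ?` query against the opening row's id.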
Refs #6 * Add a chat_id to requests * Fail early if the log.db is absent * Automatically add the chat_id column. Co-authored-by: Simon Willison <[email protected]>
Demo of the freshly-merged #14 by @amjith:
It's a bit annoying that …
Idea: Then …
I think …
I broke this - I forgot to set the default model, so now `llm "prompt"` complains that no model was specified.
I'm having second thoughts about the …
Options for an interactive chat mode:
Maybe the …
I named …
I'm closing this in favour of an issue to redesign the top-level set of commands.
The ChatGPT endpoints only work for chatting if you manually send back your previous questions and responses:
https://til.simonwillison.net/gpt3/chatgpt-api
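Since the chat endpoint is stateless, continuing a conversation means replaying every prior exchange in the `messages` array before the new prompt. A minimal sketch of building that payload from logged (prompt, response) pairs; the system prompt text is an illustrative placeholder:

```python
def build_messages(history, new_prompt):
    """Rebuild the full message list the chat API needs for context.

    history: list of (prompt, response) pairs from earlier turns.
    """
    messages = [{"role": "system", "content": "You are a helpful assistant."}]
    for prompt, response in history:
        messages.append({"role": "user", "content": prompt})
        messages.append({"role": "assistant", "content": response})
    # The new prompt goes last, so the model sees it with full context.
    messages.append({"role": "user", "content": new_prompt})
    return messages
```

The resulting list is what gets POSTed as `messages` in the chat completion request, which is exactly the bookkeeping a `chat_id`-aware log makes possible.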
This tool could help with that, maybe through a `llm chat` command?

UPDATE: Or a `-c/--continue` option for continuing the most recent conversation.