Chat reply length limit #343
Is there a maximum character limit for the response? I asked for a long note to be rewritten, but the reply was cut off at the 1,327th character. Using a 16k model in settings.
Answered by RealRavens on Sep 28, 2023 (1 comment, 3 replies)
Found "max_available_tokens: 200" in the developer console. |
The prompt and the response draw from the same pool of tokens, so my guess is there simply weren't enough tokens left for the full reply. One way around it: say you're summarizing and the response gets cut off, you can ask the AI to continue and it will pick up in the next reply. You could also split your text into smaller chunks and feed them in one at a time, with a prompt like "continue the summary with this text: …". Hope that helps a bit.
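
To make the shared-pool idea concrete, here is a minimal sketch of the arithmetic: the context window is a fixed budget, the prompt is counted against it first, and whatever remains is the most the reply can use. The names (`CONTEXT_WINDOW`, `estimateTokens`, `maxReplyTokens`) and the characters-per-token heuristic are illustrative assumptions, not identifiers from the plugin or any real API.

```ts
// Sketch only: shows how a shared token budget is split between prompt and reply.
// All names and numbers here are hypothetical, not taken from the plugin.

const CONTEXT_WINDOW = 16_384; // e.g. a "16k" model

// Rough heuristic: roughly 4 characters per token for English text.
function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4);
}

// Whatever the prompt does not consume is the most the reply can ever get.
function maxReplyTokens(prompt: string, reservedOverhead = 0): number {
  return Math.max(0, CONTEXT_WINDOW - estimateTokens(prompt) - reservedOverhead);
}

// A very long note to rewrite leaves few tokens for the answer,
// which is why the reply can stop mid-sentence.
const longNote = "Rewrite the following note: " + "lorem ipsum ".repeat(5000);
console.log(maxReplyTokens(longNote)); // little is left for the reply
```

Under this view, splitting the note into chunks (or asking the model to continue) works because each new request starts with a fresh budget instead of one already mostly spent on the prompt.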