
Chat reply length limit #343

Answered by RealRavens
fedorbass asked this question in Q&A
Sep 24, 2023 · 1 comment · 3 replies

The prompt and the response draw from the same pool of tokens, so the response probably ran out of room before it could finish. One way around it: say you're summarizing and a response gets cut off, you can ask the AI to continue and it will pick up where it left off in the next response. You could also split your text into smaller pieces, again assuming it's a summary, then feed it the next piece and say "continue the summary with this text", followed by whatever your prompt is. Hope that helps a bit.
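
For what it's worth, here's a rough sketch of that chunk-and-continue loop in Python. It assumes the OpenAI Python SDK; the model name, chunk size, and the summarize_long_text helper are all made up for illustration, and the client in whatever project this discussion belongs to may look different:

```python
# Minimal sketch of the chunk-and-continue workaround, assuming the
# OpenAI Python SDK (pip install openai) and an OPENAI_API_KEY in the env.
from openai import OpenAI

client = OpenAI()

def summarize_long_text(text: str, chunk_size: int = 6000) -> str:
    """Summarize long text by sending it in chunks, asking the model to
    continue whenever a reply is cut off by the response token limit."""
    summaries = []
    # Split the input so each request leaves room in the shared token pool
    # for the reply (chunk_size is characters here, a rough stand-in for tokens).
    chunks = [text[i:i + chunk_size] for i in range(0, len(text), chunk_size)]
    for chunk in chunks:
        messages = [{"role": "user", "content": f"Summarize this text:\n\n{chunk}"}]
        reply = ""
        while True:
            resp = client.chat.completions.create(
                model="gpt-4o-mini",  # placeholder model name
                messages=messages,
                max_tokens=512,
            )
            choice = resp.choices[0]
            reply += choice.message.content or ""
            if choice.finish_reason == "length":
                # Reply was truncated: keep the partial answer in context
                # and ask the model to continue.
                messages.append({"role": "assistant", "content": choice.message.content or ""})
                messages.append({"role": "user", "content": "Continue."})
            else:
                break
        summaries.append(reply)
    return "\n\n".join(summaries)
```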

Replies: 1 comment · 3 replies

Answer selected by fedorbass