What is the maximum amount of text that can be used when analyzing a document in ChatGPT? #330
What is the maximum amount of text that can be used when analyzing a document in ChatGPT? I've tried asking a few questions about a document with 800,000 characters, but I keep getting the error "API Error. See console logs for details." It works fine for three or four pages.
Replies: 1 comment 1 reply
The metric isn't really pages, because it depends on how much text is on each page. To explain: the ChatGPT/OpenAI models each have a maximum number of "tokens" you can use per request. Tokens are consumed by the prompt and the text you send, as well as by the response that comes back from the model.

Roughly, 80,000 characters is about 13,000 to 27,000 words, and 1,500 words ≈ 2,048 tokens (https://help.openai.com/en/articles/4936856-what-are-tokens-and-how-to-count-them).

So even a tenth of your document (80,000 characters) already takes up, on the low end, about 13,000 / 1,500 ≈ 9, and 9 × 2,048 = 18,432 tokens. The text by itself would already be too much (the largest limit we have in Smart Connections is 16k tokens), and that's before counting the prompt or the response, which require additional tokens.

I'd recommend breaking the document into pieces and prompting over each piece separately (see the sketch below). Also be mindful that the existing chat context counts toward the token limit. So if you have a huge chat and you're wondering why even small prompts aren't going through, your chat context may already be over the limit.

Hope this helps.
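If it helps, here's a minimal back-of-the-envelope sketch in TypeScript (Smart Connections is a JS plugin, but this is not its actual API). The 5-characters-per-word divisor, the 12k-token budget, and the function names are all assumptions for illustration; the 1,500 words ≈ 2,048 tokens ratio is from the OpenAI article linked above.

```typescript
// Very rough token estimate, using the ratio cited above (1,500 words ≈ 2,048 tokens).
// charsPerWord = 5 is an assumption; real prose tends to fall somewhere around 3-6.
function estimateTokens(text: string, charsPerWord = 5): number {
  const words = text.length / charsPerWord;
  return Math.ceil((words / 1500) * 2048);
}

// Naive chunking by character count so each piece stays under a token budget.
// maxTokens = 12000 is an assumed budget: below the 16k ceiling mentioned above,
// leaving headroom for the prompt and the model's response.
function chunkText(text: string, maxTokens = 12000, charsPerWord = 5): string[] {
  const maxChars = Math.floor(maxTokens * (1500 / 2048) * charsPerWord);
  const chunks: string[] = [];
  for (let i = 0; i < text.length; i += maxChars) {
    chunks.push(text.slice(i, i + maxChars));
  }
  return chunks;
}

// Example: an 800,000-character document like the one in the question.
const doc = "x".repeat(800_000);
console.log(estimateTokens(doc));   // ~218,454 tokens — far over any single-request limit
console.log(chunkText(doc).length); // ~19 pieces to prompt over separately
```

Splitting on paragraph or heading boundaries instead of raw character offsets would keep each chunk coherent, but the arithmetic is the same.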