
Stop Generation button initial implementation. #22

Merged
merged 1 commit into enricoros:main
Mar 26, 2023

Conversation

fredliubojin
Collaborator

Implemented the Stop Generation button functionality

  • Added a "Stop Generation" button below the "Chat" button
  • The "Stop Generation" button triggers the abort signal for the ongoing request
  • The ongoing request is aborted accordingly, so the text generation stops
  • The "Stop Generation" button is only active while a response is in progress, which is also when the "Chat" button is disabled
  • Because the request is aborted from the client side (browser), Node's HTTP server will log an 'ECONNRESET' error when the client aborts. Suppressing this message would require a custom server, which is not recommended for most use cases and can cost some Next.js features, so we do not suppress it in this first iteration.


@enricoros
Owner

enricoros commented Mar 23, 2023

I've tested this locally, and the API (chat.ts) still continues to receive tokens after the generation is stopped. Shall I merge?

To test it (in chat.ts):

        // https://web.dev/streams/#asynchronous-iteration
        for await (const chunk of res.body as any) {
          console.log('chunk', decoder.decode(chunk)); // <- this will continue to receive tokens from gpt-4 after the abort
          parser.feed(decoder.decode(chunk));
        }
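One way to make that server-side loop stop would be to tie the upstream request's lifetime to the incoming connection, aborting upstream when the browser disconnects. The helper below is a hypothetical sketch, not code from this PR; `req` stands in for Node's `IncomingMessage`, which emits `'close'` when the client aborts.

```typescript
import { EventEmitter } from 'node:events';

// Hypothetical helper (not in the PR): abort an upstream AbortController
// when the incoming request closes, so a for-await loop over the upstream
// body would stop receiving tokens once the browser disconnects.
function abortUpstreamOnDisconnect(req: EventEmitter, upstream: AbortController): void {
  req.once('close', () => upstream.abort());
}
```

The upstream `fetch()` would then be issued with `signal: upstream.signal`.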

@enricoros enricoros merged commit 5008f11 into enricoros:main Mar 26, 2023
enricoros added a commit that referenced this pull request Jul 5, 2023
Thanks to the Vercel team (@jridgewell), an interruption of the stream on the client
side now leads to the cancellation of the TransformStream on the server side, which
in turn cancels the open fetch() to the upstream. This was a long-needed change and
we are happy to report it works well.

Related: #114
 - vercel/ai#90
 - vercel/edge-runtime#428
 - trpc/trpc#4586 (enormous thanks to the tRPC team for issuing
   a quick release as well)
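The propagation the commit describes can be sketched as follows: the server hands the client a stream whose `cancel()` callback fires when the consumer stops reading, and that callback aborts the open upstream request. This is an illustrative reconstruction, not the actual Vercel/Next.js internals.

```typescript
// Illustrative sketch of cancellation propagation (assumed names):
// the stream returned to the client invokes `abortUpstream` when the
// consumer cancels it, which would in turn abort the upstream fetch().
function makeTokenStream(
  nextToken: () => Promise<string | null>,
  abortUpstream: () => void,
): ReadableStream<string> {
  return new ReadableStream<string>({
    async pull(controller) {
      const token = await nextToken();
      if (token === null) controller.close();
      else controller.enqueue(token);
    },
    cancel() {
      // Fires when the client side stops reading (e.g. user hits Stop);
      // propagating it here cancels the open request to the upstream.
      abortUpstream();
    },
  });
}
```

With this wiring in place, stopping generation in the browser no longer leaves the server consuming tokens, which resolves the behavior observed in the review comment above.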