
Ensure we cancel consumeUint8ArrayReadableStream if iteration breaks #428

Merged (2 commits) on Jun 26, 2023

Conversation

jridgewell (Contributor)

Right now, consumeUint8ArrayReadableStream never explicitly closes the readable stream we're consuming from. Unfortunately, that means devs can't release their resources, which leak until the GC eventually runs.

This is important for the AI streaming use case, so that devs can detect that a client has disconnected the stream and in turn disconnect the fetch they're maintaining to the AI service.
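As an illustrative sketch of that use case (not edge-runtime source; `upstreamController`, `responseStream`, and `aiServiceUrl` are hypothetical names), the dev wires a `cancel` handler so that cancelling the response stream aborts the upstream AI fetch:

```typescript
// Hypothetical example: a dev's response stream whose `cancel` handler
// aborts the upstream fetch when the client disconnects.
const upstreamController = new AbortController();

const responseStream = new ReadableStream<Uint8Array>({
  start() {
    // A real handler would begin enqueueing chunks from something like
    // `fetch(aiServiceUrl, { signal: upstreamController.signal })` here.
  },
  cancel() {
    // Without this PR, the runtime never cancelled the stream on client
    // disconnect, so this handler never ran and the fetch leaked.
    upstreamController.abort();
  },
});
```

This PR makes the runtime actually cancel the stream on early termination, so the `cancel` hook (and the abort) fires.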


changeset-bot bot commented Jun 23, 2023

🦋 Changeset detected

Latest commit: 6ffc30e

The changes in this PR will be included in the next version bump.

This PR includes changesets to release 1 package:

  edge-runtime: Patch



vercel bot commented Jun 23, 2023

The latest updates on your projects:

  edge-runtime: ✅ Ready, Jun 26, 2023 8:03am (UTC)

@enricoros

@jridgewell: the amount of real-world dollars this change will save (kept in devs' pockets rather than transferred to OpenAI) cannot be overstated :)

kodiakhq bot pushed a commit to vercel/next.js that referenced this pull request Jun 29, 2023
### What?

This updates `edge-runtime` to the latest version.

### Why?

vercel/edge-runtime#428 fixes `consumeUint8ArrayReadableStream` so that when we break iteration early (due to a client disconnect), we clean up the inner stream. That fires the stream's `cancel` handler and allows devs to disconnect their fetch to an AI service.

### How?

`edge-runtime` now maintains a `try {} finally {}` around the inner stream's iteration. When we break early, JS calls `it.return()`, which resumes `consumeUint8ArrayReadableStream` with an abrupt completion (essentially, the `yield` turns into a `return`). That triggers the `finally {}` block, where we call `inner.cancel()` to clean up.

Fixes vercel/ai#90
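A minimal sketch of the pattern described above (an illustrative stand-in, not the actual edge-runtime source; `consumeUint8Array` is a hypothetical name): the inner stream is iterated under `try {} finally {}` so an early `break` in the consumer cancels it.

```typescript
// Illustrative stand-in for edge-runtime's consumeUint8ArrayReadableStream:
// iterate the inner stream under try/finally so early termination cancels it.
async function* consumeUint8Array(
  stream: ReadableStream<Uint8Array>,
): AsyncGenerator<Uint8Array> {
  const reader = stream.getReader();
  try {
    while (true) {
      const { done, value } = await reader.read();
      if (done) return;
      yield value;
    }
  } finally {
    // An early `break` in a for-await loop calls the generator's `return()`,
    // resuming the suspended `yield` as an abrupt completion, so control
    // lands here and we cancel the inner stream (firing its `cancel` hook).
    await reader.cancel();
  }
}
```

Breaking out of `for await (const chunk of consumeUint8Array(src))` is enough to propagate cancellation to `src`'s underlying source.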
enricoros added a commit to enricoros/big-AGI that referenced this pull request Jul 5, 2023
Thanks to the Vercel team (@jridgewell), an interruption of the stream on the client
side now leads to the cancellation of the TransformStream on the server side, which
in turn cancels the open fetch() to the upstream. This was a long-needed change and
we are happy to report it works well.

Related: #114
 - vercel/ai#90
 - vercel/edge-runtime#428
 - trpc/trpc#4586 (enormous thanks to the tRPC team for issuing
   a quick release as well)