
Support for AsyncGenerator event handlers, for streaming long running responses #581

Closed
passionate-bram opened this issue Nov 22, 2023 · 2 comments · Fixed by #655
Labels: discussion, enhancement (New feature or request)

Comments

@passionate-bram (Contributor)

Describe the feature

Provide a new utility that enables developers to write streamed, chunked or partial responses using an async generator:

async function delay<T>(ms: number, result: T): Promise<T> {
  return new Promise((resolve) => setTimeout(() => resolve(result), ms));
}

const tasks = [
  () => delay(5_000, 1),
  () => delay(5_000, 2),
  () => delay(5_000, 3),
  () => delay(5_000, 4),
];

export default defineEventHandler(async (event) => {
  return sendYielding(async function*() {
    yield `<h1>Task results</h1><ol>`;
    for (let task of tasks) {
      const result = await task();
      yield `<li>${result}</li>`;
    }
    yield `</ol><p>Job done</p>`;
  });
});

When making this request, you would immediately see the <h1>Task results</h1> appear in the response (also in the browser). Then, every 5 seconds, a new list item would appear. Finally, immediately after the 4th list item, you would also see "Job done".

Added value

The main offering of this feature is a very intuitive way to let developers send results in pieces.
The code a developer writes using the proposed sendYielding is devoid of stream-API / readable-API logic and complications.
Instead, the focus lies on the business logic: a single function body in which you can easily await results and send out status updates in between.

sendYielding provides a streaming response function where the developer gives up low-level control of the streaming and receives an ergonomic API in return.

What about sendStream?

The existing sendStream function could already be used to build the same example logic shown above.
But the code would take a very different form, because the developer would have to build a stream/Readable themselves.

For me, personally, trying to read the streaming APIs of Node.js is like walking into a brick wall.
The details of stream modes and states are a high barrier to entry.

Not just in having to understand them, but in having to implement one as well. This mixes the business logic with implementation details of streams.
It is good that sendStream exists, but it is too big a step when you just want to stream parts of a response without the need for low-level control.

In all likelihood, sendYielding would be built on top of sendStream.
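A minimal sketch of that layering, purely illustrative: sendYielding is only the name proposed here, and it assumes sendStream accepts a Node.js Readable. Node's Readable.from() already converts an async iterable into a Readable, pulling chunks only as they are produced.

import { Readable } from "node:stream";
import { sendStream } from "h3";
import type { H3Event } from "h3";

// Hypothetical utility (not part of h3 today): stream every chunk yielded by
// the generator straight into the response via a Node.js Readable.
function sendYielding(event: H3Event, body: () => AsyncGenerator<string>) {
  return sendStream(event, Readable.from(body()));
}

Unlike the handler example above, this sketch takes the event explicitly, following the (event, ...) convention of other h3 utilities; the exact signature is an open question.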

Use cases

  • Send updates as progress is made on a long-running task
  • Transform a streamed response (I'm not thinking media, more like LLM completion streaming); a sketch of this follows below
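A hedged illustration of the second use case (the upstream URL and request shape are placeholders, and it assumes the proposed sendYielding exists):

import { defineEventHandler } from "h3";

// Hypothetical example: proxy an upstream completion stream and transform
// each chunk before forwarding it to the client.
export default defineEventHandler(async (event) => {
  const upstream = await fetch("https://llm.example.com/v1/completions", {
    method: "POST",
    body: JSON.stringify({ prompt: "..." }),
  });

  return sendYielding(async function* () {
    const reader = upstream.body!.getReader();
    const decoder = new TextDecoder();
    while (true) {
      const { done, value } = await reader.read();
      if (done) break;
      // Any per-chunk transformation could happen here before re-yielding.
      yield decoder.decode(value, { stream: true });
    }
  });
});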

Considerations / Open questions

  1. Pick a better name; this one is a bit crude and not entirely representative of what it does.
  2. If sendYielding wraps sendStream, then expose the AsyncGenerator-to-Readable conversion as a utility as well (along the lines of the sketch in the previous section).
  3. During development of this feature, care must be taken that the stream is in an immediate pass-through mode and does not collect the full response first.
  4. It may need to be a restriction that headers and the like cannot be sent or set from the yielding body, since the response has already started once the first chunk is flushed.
  5. How should errors be handled before, during and after the generator's lifetime? One possible approach is sketched after this list.
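For question 5, one possible shape, offered purely as a sketch (sendYieldingSafe and its signature are hypothetical): await the first chunk before the response starts, so a failure before anything is yielded can still become a regular error response; after that point an error can only terminate the stream, because the status and headers have already been sent.

import { Readable } from "node:stream";
import { sendStream } from "h3";
import type { H3Event } from "h3";

// Hypothetical error-handling variant: an error thrown before the first yield
// rejects here and goes through normal h3 error handling; an error thrown
// later destroys the underlying Readable and ends the response mid-stream.
async function sendYieldingSafe(event: H3Event, body: () => AsyncGenerator<string>) {
  const iterator = body();
  const first = await iterator.next(); // may throw before anything is sent

  async function* rest() {
    let current = first;
    while (!current.done) {
      yield current.value;
      current = await iterator.next();
    }
  }

  return sendStream(event, Readable.from(rest()));
}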

Additional information

  • Would you be willing to help implement this feature?
@pi0 added the enhancement (New feature or request) and discussion labels on Dec 7, 2023
@pi0 (Member) commented Dec 7, 2023

Seems a nice thing to support!

@passionate-bram (Contributor, Author)

I'd like to help out with the PR for this. The main problem I foresee in helping to implement this is that I don't have as wide a range of experience with the ways that h3 (and nitro) are commonly used, like the pitfalls to avoid or defend against in the code.

Regardless, I'll get started on getting an initial PR worked out.
