Support for response streaming #154
Comments
It might work with Lambda function URLs, but beside that I don't know whether Fastify's inject function can handle chunked responses in a way that passes them to awslambda.streamifyResponse 🤷‍♂️
Given the current limitations, I don't see this as such a big deal, but it would definitely help Vercel. I think a few modifications to inject could support this; however, I don't have time to work on them right now.
Thank you for the info.
Does this library support Lambda streaming responses yet? That's the main blocker to using it.
The Fastify inject function (and possibly light-my-request as well) needs to be modified first.
Is there any native support for streaming now?
Prerequisites
🚀 Feature Proposal
AWS just released the ability to stream a Lambda function's response: https://aws.amazon.com/blogs/compute/introducing-aws-lambda-response-streaming/
It would be great if this framework supported it.
It requires wrapping the handler in streamifyResponse().
Motivation
This is an extremely useful feature that greatly decreases TTFB (time to first byte).
Example