Out of proc languages stream support #1361
Comments
Related: #1319 |
This will also allow realistic static web serving of file-based resources like HTML, CSS, images, etc. |
I'd love to see this. |
node without streams? really? |
Agree this should be a v high priority for js/ts to have 1st class support |
@goofballLogic Yep :-). Based on my interactions it seems like there isn't a very large base of people using NodeJS with Azure Functions (it at least seems to always be the same 5-6 names on any GitHub ticket). Given the small number of people, the product group is doing an amazing job working through issues anyway. It's gotten a TON better in the last ~year. With that said, some silly things like this seem to take time. If I had noticed https://github.com/lambci/lambci earlier I probably would have stuck with AWS Lambda. The big thing I was missing on the Lambda side was CI/CD and I didn't want to pay for another service to get it. However, at this point things work well enough, and especially with the larger file support that Azure Functions handles vs. Lambda it makes several things I'm doing a lot easier. Add in streaming support, however, and the larger file support would be an even bigger reason to choose Azure Functions. Azure Functions has the potential to be an amazing offering for NodeJS CI/CD function development. It's getting there, but isn't quite 100% there yet. |
Yeah, we're evaluating AWS vs Azure for a hybrid C# / Node.js serverless implementation. Had assumed that Azure would be the way to go, but I think it's not quite up to the job yet :( |
@goofballLogic It all depends what you're doing. If you're dealing with small files, then Lambda is probably faster / more consistent, etc. However, take a close look at Lambda's payload limits: one of our projects was uploading/processing a ~6.5MB XML file, and Lambda couldn't handle that. I'd say that's even more core than streaming. This is kind of what happens when you're on the bleeding edge, I guess. I would say in general Lambda still seems more polished than Azure Functions, but that gap is quickly closing, and even Lambda has its stupid stuff (e.g. payload size limits). Now there are ways around this in Lambda: you can write the file to S3 instead of sending it to the Lambda function directly, then retrieve it from S3 inside the function and process it (sketched below). However, this is a LOT more code than just handling the request. With both platforms being pretty horrible to debug once deployed, more code = more trouble. As a result, any serverless stack is definitely a place where the KISS principle applies. |
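A minimal sketch of that S3 detour on the Lambda side, assuming the AWS SDK for JavaScript v3 and a hypothetical event shape carrying the bucket/key (the upload itself would typically happen earlier via a pre-signed URL):

```ts
import { S3Client, GetObjectCommand } from "@aws-sdk/client-s3";
import type { Readable } from "node:stream";

const s3 = new S3Client({});

// Hypothetical event shape: the client has already uploaded the large file to S3
// and only passes the object's location to the function.
export async function handler(event: { bucket: string; key: string }): Promise<{ statusCode: number }> {
  const { Body } = await s3.send(
    new GetObjectCommand({ Bucket: event.bucket, Key: event.key })
  );
  if (!Body) {
    return { statusCode: 404 };
  }

  // In the Node.js runtime the body is a readable stream, so the file never has
  // to fit inside Lambda's request/response payload limit.
  let bytes = 0;
  for await (const chunk of Body as unknown as Readable) {
    bytes += (chunk as Buffer).length; // process each chunk incrementally here
  }

  return { statusCode: 200 };
}
```

The extra moving parts (bucket, IAM permissions, lifecycle/cleanup of the uploaded objects) are exactly the additional code and debugging surface described above. |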
@goofballLogic I should mention, things got a lot faster and more reliable with NodeJS on Azure when I WebPacked things into their own orphaned git branch (See https://github.com/securityvoid/pack-git ). If you play with Azure much you should check it out. |
I decided not to clutter up my main repo with build artifacts. I was also thinking that every function will have a full copy of all its deps in its bundle file. The storage is limited, but at least we don't pay for it ;) |
There is also https://github.com/Azure/azure-functions-pack |
Is there any update on this request? Would love to be able to stream files |
This lack of support kinda makes the blob storage service unfeasible for node projects, would love to see this implemented. |
Can someone clarify - there is still no way to serve a binary file from Azure Function using Node? |
Would really like this functionality in the node stack. Having to pull a blob into memory seems like a big miss. |
Is there any update on this? |
Update? |
Any update on this? I am trying to achieve the exact same thing, to stream the blob. |
any update? |
Just came across this as well while trying to support Remix on Azure. Either Azure Static Apps needs to support fallback Function calls or Functions need to support streaming. |
Is there any update regarding this? This issue seems to have been on the backlog for 4 years, are there any plans to support streams for other languages than C#? (EDIT: I made this comment as an individual developer, I'm now on the Azure Static Web Apps team and am advocating for this with the Azure Functions team) |
I agree this needs to be addressed immediately. I am deep into a Java project with Azure Functions and am only now finding out that core functionality like this is not supported |
Any update? This is just another fundamental feature missing from Azure Functions.... |
I've been watching this for soooo long. Was excited to see a github notification on this topic. alas... another bump. Any word from dev team or product managers? We can't even do short lived SSE responses. FWIW cloudflare workers support this. If this ever makes it to production, please also support it in azure static web app functions without needing BYO functions. |
This is now supported in AWS Lambda: https://aws.amazon.com/blogs/compute/introducing-aws-lambda-response-streaming/ Hope Azure catches up. |
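For reference, the Lambda feature linked above is exposed through a response-streaming handler wrapper. A rough sketch based on the announcement (the `awslambda` object is a global provided by the Lambda Node.js runtime, declared here only to keep the sketch self-contained):

```ts
// Injected by the Lambda Node.js runtime at execution time.
declare const awslambda: {
  streamifyResponse(
    handler: (
      event: unknown,
      responseStream: NodeJS.WritableStream & { setContentType(type: string): void },
      context: unknown
    ) => Promise<void>
  ): unknown;
};

export const handler = awslambda.streamifyResponse(
  async (_event, responseStream, _context) => {
    responseStream.setContentType("text/plain");
    responseStream.write("first chunk\n");
    // ...write further chunks as they become available...
    responseStream.end();
  }
);
``` |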
(EDIT: PM of Azure Static Web Apps here) @sig9 very aware of the need for this in Azure Static Web Apps and hopefully we can draw attention to this so we can enable this for Azure Static Web Apps as well Thanks everyone for responding & contributing to this thread! |
@thomasgauvin The super compelling use case for this is streaming OpenAI responses back to the browser. Then I remembered, oh Yeah, I couldn't stream back a large file blob unless I used c#, why would SSE magically work... |
Similar to @sig9, this is a crucial feature for using OpenAI on Azure Functions: since OpenAI's responses are somewhat slow, streaming this data to the user is critical for an acceptable experience. Linking: Azure/azure-functions-nodejs-library#97

With:

```ts
import {
  app,
  HttpRequest,
  HttpResponse,
  InvocationContext,
} from "@azure/functions";
import { ReadableStream, TextEncoderStream } from "node:stream/web";

const DELAY = 2000;

export function handler(
  request: HttpRequest,
  context: InvocationContext
): HttpResponse {
  context.log(`Http function processed request for url "${request.url}"`);

  const stream = new ReadableStream({
    start(controller: any) {
      controller.enqueue(`
        <!doctype html>
        <html lang=en-US>
        <body>
        The time is: ${new Date().toISOString()}<br /><br />
      `);
      setTimeout(() => {
        controller.enqueue(`
          ${DELAY}ms later it is now ${new Date().toISOString()}
          </body>
        `);
        controller.close();
      }, DELAY);
    },
  });

  return new HttpResponse({
    body: stream.pipeThrough(new TextEncoderStream()),
    status: 200,
    headers: {
      "Content-Type": "text/html; charset=utf-8",
    },
  });
}

app.http("test-streaming", {
  methods: ["GET", "POST"],
  authLevel: "anonymous",
  handler,
});
```
|
Yep, I'm aware that this is a limiting factor when building OpenAI apps (I'm blocked by this myself). The Functions team has confirmed that they're working on this, starting with out of proc .NET, and this should come to JS after that. All the feedback collected in this thread is invaluable to help push for this, especially as this is seemingly a blocker for building OpenAI apps depending on Functions |
@m14t I was facing the same limitation with a Node function, so I created a C# function and utilized this library to implement streaming, and it is working correctly. I call this from a React application and the data is streamed properly. Some of the function implementation was omitted for brevity.

```csharp
OpenAIAPI api = new OpenAIAPI("get from vault or environment variable");

// Convert JSON string of messages into an array.
JArray messages = JArray.Parse(strMessages);
List<ChatMessage> chatMessages = messages.ToObject<List<ChatMessage>>();
//List<ChatMessage> chatMessages = new List<ChatMessage>();

var modelVal = Model.ChatGPTTurbo;
var temp = 0.7;

ChatRequest request = new ChatRequest()
{
    Model = modelVal,
    Temperature = temp,
    Messages = chatMessages
};

var response = req.HttpContext.Response;
response.Headers.Add("Content-Type", "text/event-stream");
response.Headers.Add("Cache-Control", "no-cache");
response.Headers.Add("Connection", "keep-alive");
response.StatusCode = 200;

await using var stream = response.Body;

var responseTokens = 0;
await foreach (var token in api.Chat.StreamChatEnumerableAsync(request))
{
    Console.Write(token.Choices.First().Delta.Content);
    var content = token.Choices.First().Delta.Content;
    if (content != null)
    {
        var contentBytes = Encoding.UTF8.GetBytes(content);
        await stream.WriteAsync(contentBytes);
        await stream.FlushAsync();
        responseTokens++;
    }
}

return new EmptyResult();
```
|
Related issue that focuses on returning streams from JS Functions: Azure/azure-functions-nodejs-library#97 |
@c-eiser13 Sounds good!!! What version of .NET function app did you create? |
@Petryxasport this is the runtime version in the Azure Portal. When creating the solution in VS, I used the Azure Functions C# template |
@c-eiser13 So, I have created the same function and we can get answers from OpenAI on the Azure Function; however, I still have an issue with the front-end part, which is based on a React application. We are using fetchEventSource from Microsoft and cannot get streaming. |
@Petryxasport on the React side, when a message is typed into the chat window and send button is clicked, here is how that is handled. Messages is my array of {role: string, content: string} objects.

```ts
const response = await service.StreamCompletion(
  messages,
  client,
  props.model ?? "gpt-3.5-turbo",
  props.temperature ?? 0.7
);

const reader = response.body.getReader();
const decoder = new TextDecoder("utf-8");
let sseData = "";

const responseObj = {
  role: "assistant",
  content: "",
  time: format(new Date(), "h:mm aaa"),
  id: getRandomString(8)
};

const copy = cloneDeep(messages);
copy.push(responseObj);

const readStream = () => {
  reader.read().then(({ done, value }) => {
    if (done) {
      setState((prev) => mergeState(prev, { loading: false }));
      console.log("SSE stream closed");
      return;
    }

    sseData += decoder.decode(value, { stream: true });
    responseObj.content = sseData;
    setState((prev) => mergeState(prev, { messages: copy }));

    setTimeout(() => {
      if (layout === "SingleWebPartAppPage") {
        const element = document.querySelector('[data-automation-id="contentScrollRegion"]');
        if (element) {
          element.scrollTo({ top: element.scrollHeight, behavior: "smooth" });
        }
      } else {
        ref?.current?.scrollTo({ top: ref.current.scrollHeight, behavior: "smooth" });
      }
    }, 100);

    readStream();
  });
};

readStream();
```

service.StreamCompletion is where I call my Azure function and it looks like this:

```ts
public StreamCompletion = async (messages: IChatMessage[], client: string, model: string, temperature: number) => {
  try {
    const response = await fetch(this.streamApi, {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        messages: JSON.stringify(messages),
        client: client,
        model: model,
        temperatur: temperature,
      }),
    });
    return response;
  } catch (error) {
    console.error(`ChatService: StreamCompletion --> error streaming completion: ${error}`);
    throw Error(error);
  }
};
```
|
Just pinging the thread to see if there are any updates here. I can use the async HTTP status URI after hacking around a bit, but it would really be nice to just stream the response. |
Hi @jeffzi19 which language are you targeting? This issue is generally used to track all languages, which each have a different status. I personally work on Node.js, which you can track using our Roadmap for general timelines or this issue for more specific details. If you're using Python, here is their repo. I don't know if they have an issue tracking this, but I know they're working on it. If you don't see an issue feel free to create one. I'm pretty sure .NET Isolated is already done. I don't think other languages are working on this yet but let me know if you have another language in mind and I can forward you to the right people who might know more. |
@ejizba Thank you for the response! I am using function 4.0 with python right now, I will take a look at their site. We looked into using the .NET isolated but some of the libraries needed by our python side are not compatible or don't have a solid enough .NET implementation to make it an option for us to change right now. Thanks again! |
I want to confess that I am using .NET isolated, and I had to use a Service Bus Queue to simulate the streaming experience of OpenAI. When I get the streaming output, I send the generated tokens, grouped by second, to the SBQ, and return the full result after the completion is done (a rough producer-side sketch follows below). SBQ supports AMQP over WebSockets; just don't put every token into a new message and the throughput will be fine. It should work with most development stacks. |
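A minimal sketch of what that token-batching producer might look like with the @azure/service-bus SDK (the queue name, the connection-string environment variable, and the once-per-second flush interval are assumptions, not taken from the comment above):

```ts
import { ServiceBusClient } from "@azure/service-bus";

// Assumed connection string variable and queue name.
const sbClient = new ServiceBusClient(process.env.SERVICE_BUS_CONNECTION_STRING!);
const sender = sbClient.createSender("openai-token-batches");

let pending: string[] = [];

// Call this for every token coming back from the OpenAI streaming API.
export function onToken(token: string): void {
  pending.push(token);
}

// Flush roughly once per second so each queue message carries a batch of tokens
// rather than a single token, keeping message throughput reasonable.
const flushTimer = setInterval(async () => {
  if (pending.length === 0) return;
  const body = pending.join("");
  pending = [];
  await sender.sendMessages({ body });
}, 1000);

// Once the completion is finished, flush whatever is left and clean up.
export async function done(): Promise<void> {
  clearInterval(flushTimer);
  if (pending.length > 0) {
    await sender.sendMessages({ body: pending.join("") });
    pending = [];
  }
  await sender.close();
  await sbClient.close();
}
```

The consuming side then relays those batched messages to the browser, which approximates streaming without the Functions host supporting it natively. |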
For those of you on Node.js, we just announced preview support for HTTP streams! 🎉 Learn more in our blog post |
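For anyone landing here later, usage as of the preview looks roughly like the sketch below; it is based on the announcement and assumes @azure/functions v4 with the `enableHttpStream` setup flag, so treat it as an outline rather than an exhaustive reference:

```ts
import { app, HttpRequest, HttpResponseInit, InvocationContext } from "@azure/functions";
import { Readable } from "node:stream";

// Opt in to the HTTP streaming preview.
app.setup({ enableHttpStream: true });

app.http("streamDemo", {
  methods: ["GET"],
  authLevel: "anonymous",
  handler: async (request: HttpRequest, context: InvocationContext): Promise<HttpResponseInit> => {
    context.log(`Streaming response for "${request.url}"`);

    // The response body can now be a Node.js readable stream instead of a
    // fully buffered string or Buffer.
    const body = Readable.from(
      (async function* () {
        yield "first chunk\n";
        await new Promise((resolve) => setTimeout(resolve, 1000));
        yield "second chunk\n";
      })()
    );

    return { body, headers: { "Content-Type": "text/plain" } };
  },
});
```

This is the kind of handler shape the blog post describes for large payloads and SSE-style responses. |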
Super excited for the Azure Functions announcement of support for streaming of responses for Node.js Functions! 🎉 This will be a mission-critical feature for building AI apps with Azure Static Web Apps and using streams for returning large payloads, etc. Thanks everyone for the feedback in this thread! |
Oh, exciting, but it took so long coming I moved on. :(
|
For all Node.js bindings (trigger/input/output), streaming is not supported - the data is put into an in-memory buffer first. This can cause out-of-memory issues for people attempting to process large blobs, for example.
We should investigate ways to support a streaming model, as you can optionally do in C# (by binding to a Stream). E.g. we might have a binding hint that allows the function to request a stream, and we provide the user code with a stream over the underlying SDK stream (a purely hypothetical illustration follows below).
Note that if we move out of process for languages, streaming is complicated. We need to consider this when making any new programming model decisions.
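To make the proposal above concrete, a purely hypothetical sketch of the difference in the Node.js programming model; nothing below is a real API today, and the stream-hinted variant is invented solely for illustration:

```ts
import type { Readable } from "node:stream";
import type { InvocationContext } from "@azure/functions";

// Today (buffered): a blob input is fully materialized in memory before the
// function runs, which is where the out-of-memory issues on large blobs come from.
export async function bufferedHandler(blob: Buffer, context: InvocationContext): Promise<void> {
  context.log(`Received ${blob.length} bytes in one allocation`);
}

// Hypothetical (not a real API): a binding hint requests a stream, and the host
// hands user code a readable stream over the underlying SDK stream, so the blob
// is processed incrementally and never held in memory all at once.
export async function streamedHandler(blob: Readable, context: InvocationContext): Promise<void> {
  let bytes = 0;
  for await (const chunk of blob) {
    bytes += (chunk as Buffer).length; // process each chunk as it arrives
  }
  context.log(`Processed ${bytes} bytes incrementally`);
}
```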