Streaming for large responses #461

Open
odisseus opened this issue Mar 14, 2023 · 0 comments


odisseus commented Mar 14, 2023

When working with very large projects (on the order of 10,000 targets), server responses that contain target information tend to be very large as well. Preparing and sending such a response in one piece is costly in both time and memory, and the resulting JSON message can even exceed the size limit of some clients.

However, in many cases such a response consists of a sequence of data objects that don't depend on each other; an obvious example is the workspace/buildTargets response. The server could, in principle, emit these objects in smaller portions, so that the client can process them at its own pace.
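For context, here is a minimal sketch (most fields omitted, shapes only roughly following the BSP definitions) of why this particular response grows with the size of the workspace:

```ts
// Simplified shapes, roughly following the BSP definitions; most fields omitted.
interface BuildTargetIdentifier {
  uri: string;
}

interface BuildTarget {
  id: BuildTargetIdentifier;
  displayName?: string;
  tags: string[];
  languageIds: string[];
  dependencies: BuildTargetIdentifier[];
  // ...capabilities and other fields omitted
}

// The entire workspace is serialised into a single response:
// with ~10,000 targets this one array makes the JSON message very large.
interface WorkspaceBuildTargetsResult {
  targets: BuildTarget[];
}
```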

In order to support large projects, I think the protocol should allow streaming the data objects in those responses where this is applicable. For what it's worth, there are implementations of JSON-RPC 2.0 that support streaming.

As a workaround, we could split the response into chunks and send them asynchronously as notifications. This would work in a manner similar to buildTarget/compile.
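A minimal sketch of what that workaround could look like at the message level; the notification method name "workspace/buildTargetsChunk", the originId field, and the reduced final result are hypothetical and not part of the current protocol:

```ts
// BuildTarget is the same shape as in the sketch above.
// Hypothetical chunked delivery: the server pushes batches of targets as
// notifications, then answers the original request with a small final result.
interface BuildTargetsChunkParams {
  originId: string;        // correlates chunks with the request that triggered them
  targets: BuildTarget[];  // one batch, e.g. a few hundred targets
}

// Final response: signals that all chunks for this request have been delivered.
interface WorkspaceBuildTargetsStreamedResult {
  originId: string;
  // no targets here; the data already arrived in the chunk notifications
}

// Client side: accumulate chunks per originId until the final response arrives,
// or process each batch incrementally instead of waiting for the full list.
const received = new Map<string, BuildTarget[]>();

function onBuildTargetsChunk(params: BuildTargetsChunkParams): void {
  const existing = received.get(params.originId) ?? [];
  received.set(params.originId, existing.concat(params.targets));
}
```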

@odisseus odisseus changed the title Make the workspace/buildTargets response asynchronous Streaming for large responses Mar 16, 2023