Backports the following commits to 7.x:

- Dev Docs
The request batching and response streaming functionality of the legacy Interpreter plugin has been moved out into a separate `bfetch` new platform plugin. Now every plugin can create server endpoints and browser wrappers that batch HTTP requests and stream responses back.

As an example, we will create a batch processing endpoint that receives a number, doubles it, and streams it back. We will also treat the number as a time in milliseconds: before streaming the number back, the server will wait for that many milliseconds.
To do that, first create a server-side batch processing route using `addBatchProcessingRoute`.
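A minimal sketch of such a route is shown below; it assumes the route is registered from a plugin's server-side `setup` method where `plugins.bfetch` is the bfetch setup contract, and the `'/my-plugin/double'` URL and the exact handler shape (`onBatchItem`) are illustrative:

```ts
// Server side: register a batch processing route on the bfetch setup contract.
// The '/my-plugin/double' path is an arbitrary example URL.
plugins.bfetch.addBatchProcessingRoute<{ num: number }, { num: number }>(
  '/my-plugin/double',
  () => ({
    onBatchItem: async ({ num }) => {
      // Treat the received number as milliseconds and wait that long...
      await new Promise(resolve => setTimeout(resolve, num));
      // ...then double it and stream the result back for this batch item.
      return { num: 2 * num };
    },
  })
);
```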
Now, on the client side, create a `double` function using `batchedFunction`.
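A sketch of the browser-side wrapper, assuming the same example URL as above and that `plugins.bfetch` is the bfetch contract available to the client-side plugin:

```ts
// Client side: create a batched function that sends requests to the route above.
const double = plugins.bfetch.batchedFunction<{ num: number }, { num: number }>({
  url: '/my-plugin/double',
});
```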
The newly created `double` function can be called many times; it will package the individual calls into batches and send them to the server.
Note: the created `double` function must accept a single object argument (`{ num: number }` in this case), and it returns a promise that resolves into an object, too (also `{ num: number }` in this case).

Use the `double` function.
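For example, each call below returns its own promise, but the three requests can be sent to the server together as a single batch:

```ts
double({ num: 1 }).then(console.log, console.error); // { num: 2 }
double({ num: 2 }).then(console.log, console.error); // { num: 4 }
double({ num: 3 }).then(console.log, console.error); // { num: 6 }
```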