[Reporting] Fix slow CSV with large max size bytes (elastic#120365)
* use Buffer.alloc + .set API instead of .concat
* refactor variable names and actually assign to this.buffer
* ok, looks like an array of buffers could work
* added a large comment and refactored some variable names
* fix comment
* refactored logic to deal with an edge case where partial buffers should be added; also throw if bad config is detected
* added new test for detecting when the write stream throws for bad config
* updated logic to never call .slice(0), updated the guard for the config error, updated a comment
* refactor totalBytesConsumed -> bytesToFlush
* use the while loop mike wrote
* remove unused variable
* update comment

Co-authored-by: Kibana Machine <[email protected]>
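The change described above swaps repeated `Buffer.concat` calls for an array of pending chunks that is copied once at flush time with `Buffer.alloc` + `.set`. As a rough illustration only (the class name and shape below are hypothetical, not the actual Kibana reporting code), the technique looks roughly like this:

```ts
// Hypothetical sketch of the buffering strategy: collect chunks in an array,
// track the byte count, and materialize the output with a single allocation.
// Repeated Buffer.concat on every write re-copies all previous bytes, which
// degrades to O(n^2) copying for large max-size limits.
class BoundedBufferAccumulator {
  private chunks: Buffer[] = [];
  private bytesToFlush = 0;

  constructor(private readonly maxSizeBytes: number) {
    if (maxSizeBytes <= 0) {
      // Mirrors the "throw if bad config is detected" idea from the commit.
      throw new Error(`maxSizeBytes must be positive, got ${maxSizeBytes}`);
    }
  }

  // Returns false once the cap is hit; a partial chunk is kept so the output
  // can be filled right up to maxSizeBytes (the "partial buffers" edge case).
  write(chunk: Buffer): boolean {
    const remaining = this.maxSizeBytes - this.bytesToFlush;
    if (remaining <= 0) return false;

    const accepted = chunk.length <= remaining ? chunk : chunk.subarray(0, remaining);
    this.chunks.push(accepted);
    this.bytesToFlush += accepted.length;
    return accepted.length === chunk.length;
  }

  // One Buffer.alloc + .set pass per flush: O(total bytes) overall.
  flush(): Buffer {
    const out = Buffer.alloc(this.bytesToFlush);
    let offset = 0;
    for (const chunk of this.chunks) {
      out.set(chunk, offset);
      offset += chunk.length;
    }
    this.chunks = [];
    this.bytesToFlush = 0;
    return out;
  }
}
```

Deferring the copy until flush is what makes large `maxSizeBytes` values cheap: each incoming chunk is touched once when written and once when flushed, regardless of how many chunks accumulate.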
1 parent d39dff3 · commit 07acac9
Showing 1 changed file with 45 additions and 9 deletions.