[BUG] Exporting a big file (>50MB) produces a V8 memory heap error #34
Comments
Thx for reporting this bug. We do need a limit on the size of the exportable CSV to avoid out-of-memory issues. Maybe this could be a plugin config item with a default value.
Technically, I don't think a size limit is necessary. The current issue comes from the fact that the CSV is created only after all the matching data has been retrieved. I made a quick implementation on the following branch, but I have not tested or finalized it yet: https://github.com/MKCG/dashboards-reports/pull/1/files May I open a draft pull request on this repository to suggest a streaming implementation?
Yes, a draft PR will be better. Thx
Ok, I will do that maybe next week. |
(opensearch-project#35) Signed-off-by: Joshua Li <[email protected]> (cherry picked from commit 7a60ae8a95968d8f977cfc5a48984789e240f20b) Co-authored-by: Joshua Li <[email protected]>
Describe the bug
When a generated report contains a lot of documents, the JavaScript V8 engine might suffer a heap out-of-memory exception and crash.
I believe this is caused by the export process keeping all the content to be exported in memory, then writing it all at once into the CSV file.
To Reproduce
Steps to reproduce the behavior:
Expected behavior
There should be almost no limit to the size of the exported CSV file; it should be possible to generate CSV files far bigger than the memory allocated to the JavaScript heap.
Example of a trace
Host/Environment (please complete the following information):