Opening and destroying the same entry causes high memory usage #87
Comments
I've reproduced the issue. I'll look into this and get back to you.

I was able to resolve the issue by modifying the […]. I'll make a PR against fd-slicer with this feature, and then update yauzl to depend on it.

Published yauzl 2.9.2, which should fix this issue. Give it a try and let me know what you observe.

I tested and it works fine, even with larger files (tested with a 300 MB zip). The memory usage is steady and low. Many thanks!
I need to read the first row of a zipped CSV file to create a table in a PostgreSQL database, and then pg-copy the whole CSV file into the table.
I open the same zip entry twice with `zipFile.openReadStream`. After I read the first chunk, I destroy the stream (with `readStream.destroy()`) and open it again. Everything works as I expected, except the memory usage: I have a ~1.6 GB CSV file compressed to a 110 MB zip. When I open the entry for the first time, the max memory usage (RSS) is 120 MB. After re-opening the same entry, the RSS climbs from 120 MB to 1.7 GB. If I don't destroy the stream on the first run and let it finish, the max RSS is 150 MB and stays that low after I reopen the entry, so I don't think the problem is with re-opening the entry. There must be something wrong with how I destroy it.
My system: Windows 10, Node 10.
The excerpt of my code:
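(A minimal sketch of the flow described above, assuming yauzl's `open`/`openReadStream` API; the file name, header-row parsing, and second-pass handling are illustrative assumptions, not the original excerpt.)

```js
const yauzl = require('yauzl');

// Sketch: open the zip lazily, stream the first entry, destroy the stream after
// the first chunk (to grab the CSV header row), then open the same entry again
// and consume it fully. "data.zip" is a placeholder.
yauzl.open('data.zip', { lazyEntries: true, autoClose: false }, (err, zipFile) => {
  if (err) throw err;
  zipFile.readEntry();
  zipFile.on('entry', (entry) => {
    // First pass: read only the first chunk, then destroy the stream.
    zipFile.openReadStream(entry, (err, readStream) => {
      if (err) throw err;
      readStream.once('data', (chunk) => {
        const headerRow = chunk.toString('utf8').split('\n')[0];
        console.log('header row:', headerRow);
        readStream.destroy(); // stop reading after the first chunk

        // Second pass: open the same entry again and consume it in full
        // (e.g. pipe it into pg-copy here).
        zipFile.openReadStream(entry, (err, fullStream) => {
          if (err) throw err;
          fullStream.on('data', () => { /* feed the COPY stream */ });
          fullStream.on('end', () => zipFile.close());
        });
      });
    });
  });
});
```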
Memory usage: