"Too many open files" when building large jars inside VM/chroot environment #6
It's definitely the parallel algorithm consuming more file handles. How many file handles do you have available?
My hard and soft open-file limits are 4096 and 1024, respectively. The jar has more than 10,000 files, and this has never been a problem before. Increasing these limits is not a solution: not only is this a regression in behaviour, but the number of files in this jar is likely to grow over time. However, in my tests I only see this in a virtualised environment, not on a real system. My initial thought is that the slower I/O could be keeping files open longer, so the number open concurrently is higher than it would be on real hardware. Does this sound plausible?
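For reference, the limits being discussed can be inspected (and, for the current shell session, raised as far as the hard limit allows) like this; a minimal sketch for a Linux/macOS shell:

```shell
# Soft limit on open file descriptors -- the one the JVM actually hits.
ulimit -Sn
# Hard limit -- the ceiling up to which a non-root user may raise the soft limit.
ulimit -Hn
# Raise the soft limit to the hard limit for this shell session only:
ulimit -Sn "$(ulimit -Hn)"
```

This is only a workaround for diagnosis; as noted above, raising the limit does not address the regression itself.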
Please see pull request #7, which adds a test case illustrating this problem (a simple test that adds 15,000 files to a compressed zip archive). It reproduces the failure for me on real hardware, not only in a VM.
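The actual test case in PR #7 goes through plexus-archiver (and hence commons-compress); the shape of the workload can be approximated with nothing but the JDK's own zip API. A hedged sketch, not the PR's code, with a hypothetical `createLargeZip` helper:

```java
import java.io.IOException;
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

public class LargeZipRepro {

    // Write `count` small entries into a compressed zip at `target`.
    public static void createLargeZip(Path target, int count) throws IOException {
        try (OutputStream os = Files.newOutputStream(target);
             ZipOutputStream zos = new ZipOutputStream(os)) {
            zos.setLevel(9); // maximum compression, like a release jar
            byte[] payload = "some file content\n".getBytes(StandardCharsets.UTF_8);
            for (int i = 0; i < count; i++) {
                zos.putNextEntry(new ZipEntry("dir/file-" + i + ".txt"));
                zos.write(payload);
                zos.closeEntry();
            }
        }
    }

    public static void main(String[] args) throws IOException {
        Path zip = Files.createTempFile("large", ".zip");
        createLargeZip(zip, 15000);
        System.out.println("wrote " + Files.size(zip) + " bytes to " + zip);
    }
}
```

Note this single-threaded version holds only one descriptor for the archive itself; the parallel scatter approach in commons-compress keeps many backing stores open concurrently, which is where the limit bites.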
Issues fixed on the 2.x branch. Will be merged to master (3.x) "soon". Thanks for the excellent test case!
I am trying out the latest plexus-archiver, commit dc873a4.
When I am building a large jar file inside a mock chroot environment inside a VM, I see a "Too many open files" exception, as shown below.
This seems to be a problem with using ParallelScatterZipCreator from commons-compress 1.10, because my problem went away when I patched it out: http://pkgs.fedoraproject.org/cgit/plexus-archiver.git/tree/0001-Avoid-using-ParallelScatterZipCreator.patch
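One way to check whether the parallel creator is what exhausts descriptors is to watch the open-fd count of the building JVM as the jar is assembled. A Linux-only sketch using `/proc` (it samples the current shell's own pid purely for illustration; substitute the Maven/JVM pid when diagnosing a real build):

```shell
# Count open file descriptors of a process via /proc (Linux only).
pid=$$   # placeholder: use the JVM/Maven pid for a real diagnosis
ls "/proc/$pid/fd" | wc -l

# Sample in a loop during the build and watch the count approach `ulimit -Sn`:
# while kill -0 "$pid" 2>/dev/null; do ls "/proc/$pid/fd" | wc -l; sleep 1; done
```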
I encounter this when building the Eclipse platform in Fedora Linux build infrastructure, but I will try to generate a more concise test case...