Investigate switching global allocator to mimalloc #81
Comments
Making the switch to use `mimalloc` …
Further performance investigation and validation on various platforms will be required.
The performance of using …
Eyeballing some output from a few runs under …
Thanks for the work on this. For me, the 'pull' time saved vs. the few seconds (since it's only a single repo) almost makes the non-Debian image attractive (even with the 3x slowdown), since most repos we scan are smallish and scan times are measured in seconds.
@munntjlx I have a branch I'll be pushing and merging "soon", which switches the allocator to mimalloc, sidesteps the Alpine performance issue, and seems to work all around. I'm thinking the next Nosey Parker release could provide both the glibc-based Docker image and an Alpine-flavored one.
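One way a single codebase could back both a glibc-based image and an Alpine-flavored one is to gate the allocator behind a Cargo feature. This is only a sketch, not necessarily what the branch mentioned above does; it assumes an optional `mimalloc` dependency wired to a hypothetical feature named `mimalloc-allocator` in Cargo.toml, enabled at build time with `cargo build --features mimalloc-allocator`.

```rust
// When built with the (hypothetical) `mimalloc-allocator` feature, mimalloc
// becomes the process-wide allocator; otherwise the platform default
// (glibc malloc or musl malloc) is kept unchanged.
#[cfg(feature = "mimalloc-allocator")]
#[global_allocator]
static GLOBAL: mimalloc::MiMalloc = mimalloc::MiMalloc;

fn main() {
    println!("running with the configured global allocator");
}
```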
Great news! It's funny how much difference an allocator can make!
This has been merged back to …
I leave an emoticon with my thanks and gratitude: 😍
Using `musl` instead of `glibc` when building Nosey Parker results in a significant drop in scan performance, presumably due to the allocator implementation in `musl` not supporting threaded workloads very well (see here).

It may be possible to sidestep this by using a different global allocator in Nosey Parker. In particular, it appears that `jemalloc` does not build with `musl`. But `mimalloc` does build there, and there is a Rust crate for it already.

Is it easy to switch Nosey Parker to use `mimalloc` as its global allocator? How does switching impact performance of native-code builds? How does it affect performance of Docker-based builds, particularly the Alpine-based build in #77?
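The Rust crate mentioned above makes the switch itself a very small code change; the open question is the performance validation. A minimal sketch, assuming the published `mimalloc` crate is added as a dependency (e.g. `mimalloc = "0.1"` in Cargo.toml); the static name `GLOBAL` is arbitrary.

```rust
use mimalloc::MiMalloc;

// Route every heap allocation in the process through mimalloc instead of
// the platform's default allocator (glibc malloc or musl malloc).
#[global_allocator]
static GLOBAL: MiMalloc = MiMalloc;

fn main() {
    // Ordinary allocations below now go through mimalloc; no other code
    // changes are needed in the rest of the program.
    let inputs: Vec<String> = (0..4).map(|i| format!("input {i}")).collect();
    println!("prepared {} inputs", inputs.len());
}
```

Because the `#[global_allocator]` attribute applies process-wide, the change is contained in one place, which makes before/after benchmarking of native and Docker-based builds straightforward.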