Something is locking the memory mappings #18
Comments
Interestingly, I can't find anything in go-ipfs locking pages into memory. However,
And
I give up. Even after hours of inactivity, something's still locking 1.3 GiB of mmapped badger memory, and I can't find anything in the go compiler/runtime/libraries or any of our libraries that could possibly be making the necessary syscalls (and nothing in my strace runs either).
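To tell apart pages that are actually mlock'ed from pages that are merely resident, something like the following could be run against the process. This is a minimal sketch, not code from this thread: the program name, argument handling, and the `.vlog` filter are my own assumptions. It sums the `Rss:` and `Locked:` fields of the value-log mappings in /proc/&lt;pid&gt;/smaps and flags any mapping with a nonzero `Locked:`.

```go
// smapscheck.go — sketch: sum Rss/Locked for badger value-log mappings.
package main

import (
	"bufio"
	"fmt"
	"log"
	"os"
	"strconv"
	"strings"
)

func main() {
	if len(os.Args) != 2 {
		log.Fatalf("usage: %s <pid>", os.Args[0])
	}
	f, err := os.Open("/proc/" + os.Args[1] + "/smaps")
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()

	var current string // header line of the mapping we are currently inside
	var rssKB, lockedKB int64

	sc := bufio.NewScanner(f)
	for sc.Scan() {
		fields := strings.Fields(sc.Text())
		if len(fields) < 2 {
			continue
		}
		if !strings.HasSuffix(fields[0], ":") {
			// A new mapping header: "addr-addr perms offset dev inode [path]".
			current = sc.Text()
			continue
		}
		if !strings.Contains(current, ".vlog") {
			continue // only interested in badger value-log mappings
		}
		kb, _ := strconv.ParseInt(fields[1], 10, 64)
		switch fields[0] {
		case "Rss:":
			rssKB += kb
		case "Locked:":
			lockedKB += kb
			if kb != 0 {
				fmt.Println("locked mapping:", current)
			}
		}
	}
	fmt.Printf("vlog mappings: Rss=%d kB, Locked=%d kB\n", rssKB, lockedKB)
}
```

A nonzero Locked total would mean something really is pinning the pages; a large Rss with Locked at 0 only means the pages are resident and still reclaimable. Reading another process's smaps requires the same user or ptrace permission.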
Until the value log file is closed or GC'ed.
It doesn't (AFAIK).
In a normal
(For some reason Badger mmaps twice the value log file size, i.e., 2 GB.)
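If the mappings only go away once the value log is closed or GC'ed, an application can at least nudge badger into rewriting old .vlog files itself. The sketch below assumes badger v3's public API (the go-ipfs setup in this thread goes through an older version and a datastore wrapper, so the path and discard ratio here are placeholders):

```go
package main

import (
	"log"

	badger "github.com/dgraph-io/badger/v3"
)

func main() {
	// Hypothetical path; go-ipfs keeps its badger store elsewhere.
	db, err := badger.Open(badger.DefaultOptions("/tmp/badger-gc-demo"))
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	// RunValueLogGC rewrites at most one .vlog file per call and returns
	// ErrNoRewrite once nothing is worth rewriting, so it is called in a loop.
	// Rewritten files are closed and deleted, which also drops their mmap.
	for {
		if err := db.RunValueLogGC(0.5); err != nil {
			if err == badger.ErrNoRewrite {
				break
			}
			log.Fatal(err)
		}
	}
}
```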
Could you provide me with a script to deterministically reproduce your use case?
Doing an
Rollup:
@Stebalien Sorry, I still can't reproduce this. Could you provide me with a specific hash (or a set of commands to produce that hash)? Also, I have no idea what the rollup is.
The rollup is just /proc/pid/smaps_rollup (showing totals).
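For anyone following along: that file is the kernel's pre-summed version of smaps (available since roughly kernel 4.14), so reading it is just a matter of dumping it. A trivial sketch, with the program name and argument handling being my own:

```go
// rollup.go — print the summed memory accounting for a pid.
package main

import (
	"fmt"
	"log"
	"os"
)

func main() {
	if len(os.Args) != 2 {
		log.Fatalf("usage: %s <pid>", os.Args[0])
	}
	data, err := os.ReadFile("/proc/" + os.Args[1] + "/smaps_rollup")
	if err != nil {
		log.Fatal(err)
	}
	fmt.Print(string(data)) // Rss:, Pss:, Locked:, etc., totalled over all mappings
}
```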
My kernel appears to be unrelated. Could you try on tmpfs? Thinking this might be due to my filesystem setup (btrfs with CoW and compression), I tested IPFS in /tmp. However, now I'm thinking that tmpfs may have the same issue.
Still can't reproduce. I've mounted the
but none of the
Knowing now the
IIRC, I had other vlogs mapped as well that were using more memory (those were just a sample). I'm running kernel 4.16, so that may make a difference.
So, I'm no longer seeing this. Most pages are now "private clean". Not exactly sure what happened.
So, badger mmaps files and leaves them mapped. While I can't find anything in badger that explicitly locks these mappings into memory, something is locking ranges (go?) and it's killing my memory.
To reproduce, write a bunch of data (gigabytes) into a badger datastore and then read it back out.
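A rough reproduction sketch along those lines, assuming badger v3's API (go-ipfs actually uses badger through a datastore wrapper, so the path, value sizes, and counts here are placeholders):

```go
package main

import (
	"crypto/rand"
	"encoding/binary"
	"log"

	badger "github.com/dgraph-io/badger/v3"
)

func main() {
	db, err := badger.Open(badger.DefaultOptions("/tmp/badger-mmap-repro"))
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	val := make([]byte, 1<<20) // 1 MiB values
	if _, err := rand.Read(val); err != nil {
		log.Fatal(err)
	}

	// Write ~4 GiB so several value log files are created and mmapped.
	for i := 0; i < 4096; i++ {
		key := make([]byte, 8)
		binary.BigEndian.PutUint64(key, uint64(i))
		if err := db.Update(func(txn *badger.Txn) error {
			return txn.Set(key, val)
		}); err != nil {
			log.Fatal(err)
		}
	}

	// Read everything back so the mmapped value-log pages become resident.
	for i := 0; i < 4096; i++ {
		key := make([]byte, 8)
		binary.BigEndian.PutUint64(key, uint64(i))
		if err := db.View(func(txn *badger.Txn) error {
			item, err := txn.Get(key)
			if err != nil {
				return err
			}
			return item.Value(func(v []byte) error { return nil })
		}); err != nil {
			log.Fatal(err)
		}
	}

	select {} // stay idle so the .vlog mappings can be inspected from another shell
}
```

While the process sits in the final select, the .vlog mappings can be watched via /proc/&lt;pid&gt;/smaps or smaps_rollup as discussed above.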