RAM usage insanity #10
See this comment: facebook/rocksdb#3216 (comment). It appears that RAM usage runs at roughly 3x disk usage, and that almost all of the disk usage is in the sst files. Limiting the number of sst files that rocks can create, by setting …
With the all-new, all-improved code base, after a run of shapes+disjuncts classification: a total of 1108125 atoms; guile has a 4.8GB heap, of which all but 20MB are free. Total RAM use is 42GB. Rocks stats:

So it seems that 37 sst file descriptors have been leaked; all of these point at sst files that have already been deleted.

So 42GB and 37 leaked file descriptors is a vast improvement over the 180GB+corruption seen before, but it is still not acceptable. This is with Ubuntu focal 20.04.

Exiting and restarting shows 5GB RAM use, so that's maybe/mostly all AtomSpace RAM!? and not rocks RAM. So that implies that, of the 42GB above, 32GB was leaked by rocks. Yow!
Retest, using rocksdb version 6.19.0 compiled from github source bb75092574532c5629c27dcd99fe55f5514af48c. It appears that even the latest version is leaking file descriptors. RAM usage after computing:
ls -la shape.rdb/*sst | wc   # 10
(cog-close storage-node)
(cog-open storage-node)
stop guile, start guile, stop guile:
=======
Now do the classification:
(cog-close storage-node)
(cog-open storage-node)

Yikes! This time it leaks 74 file descriptors! Ouch!
Fixed. The code was leaking iterators. Fixed in commit 10fb460. Log:

after finishing:
after cog-close:
after cog-open:
after cog-close:
after exiting guile:
During learning on a tiny grammar, RAM usage by RocksDB exploded to 90 GBytes. This is insane; it should not be more than a few GBytes for this workload. This is 40x greater RAM usage than expected. The 40x number is just like the one in issue #9, and might be curable in the same way.