Persistent high memory usage after using the new files API #2615

Closed

matthewrobertbell opened this issue Apr 28, 2016 · 9 comments

Comments

matthewrobertbell commented Apr 28, 2016

Hi,

I'm running the 64-bit Linux 0.4 build from the releases page on a 1 GB DigitalOcean server. I added a bunch of small files using the files API (files/mkdir and files/write). After each batch, the memory usage of IPFS increases and doesn't drop (I waited over 8 hours and it's stable). It's currently at 800 MB, which seems excessive. I've stopped adding files, for fear of running out of memory.
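For reference, each batch boils down to calls like the following against the daemon's HTTP API. This is only an illustrative Go sketch of the pattern; the port is the default API port, and the paths, multipart field name, and file count are made up:

package main

import (
    "bytes"
    "fmt"
    "mime/multipart"
    "net/http"
)

const api = "http://127.0.0.1:5001/api/v0"

// mkdir creates a directory in the MFS via files/mkdir.
func mkdir(path string) error {
    resp, err := http.Post(api+"/files/mkdir?arg="+path+"&parents=true", "", nil)
    if err != nil {
        return err
    }
    return resp.Body.Close()
}

// write creates a file at the given MFS path via files/write,
// sending the contents as a multipart upload.
func write(path string, data []byte) error {
    var body bytes.Buffer
    w := multipart.NewWriter(&body)
    part, _ := w.CreateFormFile("file", "data") // field and file names here are arbitrary
    part.Write(data)
    w.Close()
    resp, err := http.Post(api+"/files/write?arg="+path+"&create=true",
        w.FormDataContentType(), &body)
    if err != nil {
        return err
    }
    return resp.Body.Close()
}

func main() {
    // One "batch": a directory plus a small file under it, repeated many times.
    for i := 0; i < 1000; i++ {
        dir := fmt.Sprintf("/batch/%d", i)
        mkdir(dir)
        write(dir+"/file.txt", []byte("small file contents"))
    }
}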

I tried running "ipfs repo gc", but memory usage did not decrease.

I have not tried using other commands, so I don't know if the files API is specifically to blame.

The process is currently running, I'm happy to dump whatever info you wish, none of the data I have added is private.

Cheers

matthewrobertbell (Author)

(screenshot: server CPU and memory usage graph, 2016-04-28 12:13)

No files were added after around 11PM. Maybe the high CPU and high memory are related?

skmgoldin

I observe similar memory and CPU usage running go-ipfs v0.4.1 on Ubuntu 14.04. It does seem excessive; v0.3.* ran lighter.

whyrusleeping (Member)

@mattseh I think you're right, the files API isn't cleaning up after itself as it should.

@skmgoldin If you're not using the files API and are still seeing excessive CPU and memory usage, please gather the debug information discussed here: https://github.com/ipfs/go-ipfs/blob/master/debug-guide.md and send it to me. It will help me find the source of any leaks and memory/CPU hogs.
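For anyone who hasn't done this before: the collection steps in that guide amount to pulling Go's pprof endpoints off the daemon's API port (5001 by default). A rough Go sketch of the same thing, in case curl isn't handy; the output filenames are just suggestions:

package main

import (
    "io"
    "net/http"
    "os"
)

// fetch downloads one pprof endpoint to a local file.
func fetch(url, out string) error {
    resp, err := http.Get(url)
    if err != nil {
        return err
    }
    defer resp.Body.Close()
    f, err := os.Create(out)
    if err != nil {
        return err
    }
    defer f.Close()
    _, err = io.Copy(f, resp.Body)
    return err
}

func main() {
    base := "http://127.0.0.1:5001/debug/pprof/"
    fetch(base+"goroutine?debug=2", "ipfs.stacks") // full goroutine stack dump
    fetch(base+"heap", "ipfs.heap")                // heap profile
    fetch(base+"profile", "ipfs.cpuprof")          // CPU profile; this call blocks while it's recorded
}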

matthewrobertbell (Author) commented May 18, 2016

The same script seems to be causing rising RAM usage on IPFS 0.4.2, although it rises much more slowly than on 0.4.0. It's a single-threaded script adding a bunch of mostly small files using the files API. Memory is currently at about 530 MB.

CPU usage is 200% (both cores of a 2 GB VM), but it does die down when files are no longer being added, which IIRC is an improvement.

The daemon did crash once, with:

panic: close of nil channel

goroutine 4093840 [running]:
panic(0xc94860, 0xc828caa140)
/home/whyrusleeping/go/src/runtime/panic.go:464 +0x3e6
gx/ipfs/QmaDNZ4QMdBdku1YZWBysufYyoQt1negQGNav6PLYarbY8/go-log.(*bufWriter).loop(0xc830bdbaa0)
/builds/distributions/dists/go-ipfs/gopath/src/gx/ipfs/QmaDNZ4QMdBdku1YZWBysufYyoQt1negQGNav6PLYarbY8/go-log/writer.go:233 +0x2f5
created by gx/ipfs/QmaDNZ4QMdBdku1YZWBysufYyoQt1negQGNav6PLYarbY8/go-log.newBufWriter
/builds/distributions/dists/go-ipfs/gopath/src/gx/ipfs/QmaDNZ4QMdBdku1YZWBysufYyoQt1negQGNav6PLYarbY8/go-log/writer.go:151 +0xbf
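(For what it's worth, that panic message just means something closed a channel that was never created: closing a nil channel always panics in Go, so presumably the log writer's channel was closed before being initialized. A minimal illustration of the language behaviour:)

package main

func main() {
    var ch chan struct{} // nil: declared but never made with make()
    close(ch)            // panic: close of nil channel
}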

I will keep the script and daemon running, and report any changes.

matthewrobertbell (Author)

It is now at 692 MB. @whyrusleeping or other IPFS team members: you can have access to this VPS to gather data if you wish.

ghost commented May 18, 2016

Hey @mattseh, thanks for reporting this -- check out debug-guide.md for debugging these kinds of issues and providing useful information. Thanks!

matthewrobertbell (Author) commented May 18, 2016

Here's the debug info: https://ipfs.io/ipfs/QmedpSbkZxUQQpwRiM2eVdGWUCnDZsr51YtRLdv4UdFANv. I'll try to look for anything myself as well.

The IPFS version is the official 0.4.2 Linux 64-bit release.

matthewrobertbell (Author)

I missed the stack dump: https://ipfs.io/ipfs/QmfTmdf8r7g9X6BW3gEwfKrdvPdBFijdSiCDWAMew7aThz. As per the debug guide, there are a bunch of goroutines that have been hung for many minutes.
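For anyone skimming that dump: each goroutine header in a debug=2 dump records how long it has been blocked, e.g. "goroutine 123 [select, 43 minutes]:", so a quick way to pull out the stuck ones is something like this rough sketch (the filename is just whatever the dump was saved as):

package main

import (
    "bufio"
    "fmt"
    "os"
    "strings"
)

func main() {
    f, err := os.Open("ipfs.stacks") // the saved goroutine dump
    if err != nil {
        panic(err)
    }
    defer f.Close()

    count := 0
    sc := bufio.NewScanner(f)
    for sc.Scan() {
        line := sc.Text()
        // Headers of long-blocked goroutines carry a "minutes" annotation.
        if strings.HasPrefix(line, "goroutine ") && strings.Contains(line, "minutes") {
            fmt.Println(line)
            count++
        }
    }
    fmt.Printf("%d goroutines blocked for minutes\n", count)
}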

This was referenced May 18, 2016
ghost mentioned this issue May 29, 2016
ghost commented May 29, 2016

@mattseh thanks for providing these! I've linked them in #1924 and we'll continue to track the issue there.

ghost closed this as completed May 29, 2016