question: msgId and possible memory leak? #526

Closed
richard-ramos opened this issue Mar 9, 2023 · 3 comments

Comments

richard-ramos (Contributor) commented Mar 9, 2023

Hello! I'm trying to debug a memory leak I'm seeing in my project. After running pprof with --inuse_objects (to see objects allocated but not yet released), I end up with the following stats:

File: waku
Build ID: 9a40b11966dcdbae7d63435fc764c898ae39db63
Type: inuse_objects
Time: Mar 9, 2023 at 6:06pm (UTC)
Entering interactive mode (type "help" for commands, "o" for options)
(pprof) top 10
Showing nodes accounting for 2068186, 97.91% of 2112367 total
Dropped 175 nodes (cum <= 10561)
Showing top 10 nodes out of 64
      flat  flat%   sum%        cum   cum%
   1392682 65.93% 65.93%    1392682 65.93%  github.com/waku-org/go-waku/waku/v2/protocol/relay.msgIdFn
...

(pprof) list relay.msgIdFn
Total: 2112367
ROUTINE ======================== github.com/waku-org/go-waku/waku/v2/protocol/relay.msgIdFn in /root/go-waku/waku/v2/protocol/relay/waku_relay.go
   1392682    1392682 (flat, cum) 65.93% of Total
         .          .     54:func msgIdFn(pmsg *pubsub_pb.Message) string {
   1392682    1392682     55:	return string(utils.SHA256(pmsg.Data))
         .          .     56:}
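
For context, here is roughly how a custom message ID function like this gets wired into go-libp2p-pubsub. This is a minimal sketch, assuming the standard pubsub.WithMessageIdFn option; the host setup around it is illustrative, not the exact go-waku code:

package main

import (
	"context"
	"crypto/sha256"

	"github.com/libp2p/go-libp2p"
	pubsub "github.com/libp2p/go-libp2p-pubsub"
	pubsub_pb "github.com/libp2p/go-libp2p-pubsub/pb"
)

// msgIdFn derives the message ID from the SHA-256 of the payload, so
// identical payloads map to the same ID regardless of sender or seqno.
func msgIdFn(pmsg *pubsub_pb.Message) string {
	h := sha256.Sum256(pmsg.Data)
	return string(h[:])
}

func main() {
	host, err := libp2p.New()
	if err != nil {
		panic(err)
	}
	defer host.Close()

	// Every message the router sees is passed through msgIdFn before it
	// is checked against the seen-message cache and the message cache.
	_, err = pubsub.NewGossipSub(context.Background(), host,
		pubsub.WithMessageIdFn(msgIdFn))
	if err != nil {
		panic(err)
	}
}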

So far I haven't been able to identify why this custom message ID function appears in the top 10, but in the meantime I was looking at the code that calls it and saw this:

delete(mc.msgs, entry.mid)
delete(mc.peertx, entry.mid)

Since maps do not shrink when elements are deleted, as described in golang/go#20135, isn't it possible that a memory leak exists in mc.msgs and mc.peertx, depending on the volume of messages a node receives?
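
If so, the usual mitigation for golang/go#20135 is to periodically copy the surviving entries into a fresh map so the oversized bucket array can be garbage-collected. A sketch of that pattern (the type, fields, and threshold here are illustrative, not the actual pubsub message cache):

package main

type messageCache struct {
	msgs    map[string][]byte
	deletes int // deletions since the map was last rebuilt
}

func (mc *messageCache) remove(mid string) {
	delete(mc.msgs, mid)
	mc.deletes++
	// delete() never shrinks a map's backing storage, so once many
	// entries have been removed, rebuild the map at its live size.
	if mc.deletes > 4096 && mc.deletes > 2*len(mc.msgs) {
		fresh := make(map[string][]byte, len(mc.msgs))
		for k, v := range mc.msgs {
			fresh[k] = v
		}
		mc.msgs = fresh
		mc.deletes = 0
	}
}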

nisdas (Contributor) commented Mar 15, 2023

Just saw this issue, but it's most likely fixed by #528.

There has been a new release with the change, so you could give that a shot and see if it fixes your issue.

vyzo (Collaborator) commented Mar 15, 2023

Yes, that's probably it. Sorry guys!

richard-ramos (Contributor, Author) commented

Thank you! This fixes the issue.
