Memory leak starting in 4.3.5/4.3.6 sending large binary messages #1437

Closed
tsightler opened this issue Mar 3, 2022 · 7 comments
@tsightler

Over the last week I've been struggling to track down a memory leak which I initially thought was due to a code refactor on my side. However, after much troubleshooting, and effectively reverting all of my code changes, the problem was still occurring. My project uses MQTT to send lots of basic sensor data in simple text and JSON formats, but it also sends a set of fairly small (~15-30 KB) JPG images via MQTT every 30 seconds or so.

I eventually found that the leak appeared to go away if I disabled the image function. However, I've been maintaining this project for a few years and had never seen this behavior previously, even in cases where the code ran for weeks or months. I started comparing a known working Docker image with my local dev system and noticed that the only significant difference was the change from MQTT 4.3.4 to 4.3.6. Sure enough, as soon as I reverted my development environment to 4.3.4, the problem went away. Switch to 4.3.5 or 4.3.6, and the problem comes back.

I looked at the code changes in 4.3.5 and I'm actually quite hesitant to open this issue, as the changes seem very minor and it's not at all clear to me how they could cause this behavior, but I don't currently understand the code path there. All I can say is that, in my case, it so far seems to be 100% reproducible. Using 4.3.4, sending the JPG images via MQTT works great with no long-term memory growth, while with 4.3.5/4.3.6 everything still seems to work, but node process memory grows at ~10 MB/hr.

I can try to make a simple reproducer if that would help, but I'm hoping that just pointing it out might trigger some thought from someone who knows the code path better and understands why the change in 4.3.5 was made and how it could possibly cause this behavior. Thanks!
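
In the meantime, the publishing pattern boils down to something like this (a simplified sketch with a placeholder broker URL and topic, not my actual project code):

```js
// Simplified sketch of the publish pattern described above (placeholder
// broker URL and topic; the payload stands in for a ~20 KB JPG snapshot).
const mqtt = require('mqtt')
const client = mqtt.connect('mqtt://localhost:1883')

client.on('connect', () => {
  setInterval(() => {
    const image = Buffer.alloc(20 * 1024, 0xff) // fake binary image data
    // Published without a callback -- this is where memory grows on 4.3.5/4.3.6
    client.publish('camera/snapshot/image', image)
  }, 30 * 1000)
})
```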

@cluyet

cluyet commented Mar 4, 2022

Hi, in my case I have a different problem: since version 4.3.5 my connection to MQTT is "blocked" as soon as I unsubscribe from a topic. The connection is not closed, but I no longer receive any messages from MQTT. The same code works perfectly with version 4.3.4.

Looking at the code changes in 4.3.5, I also don't understand how they could create my problem.
At the moment I don't want to open a ticket for this until I have more information, so I'm also very interested if someone can better explain the changes between 4.3.4 and 4.3.5.
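
Roughly, the pattern looks like this (a simplified sketch with a placeholder broker URL and topics, not my real code):

```js
// Simplified sketch of the subscribe/unsubscribe pattern described above
// (placeholder broker URL and topics).
const mqtt = require('mqtt')
const client = mqtt.connect('mqtt://localhost:1883')

client.on('connect', () => {
  client.subscribe(['sensors/a', 'sensors/b'])
})

client.on('message', (topic, payload) => {
  console.log(topic, payload.toString())
})

// With 4.3.5, after this unsubscribe no further 'message' events arrive,
// even for 'sensors/b', although the connection itself stays open.
setTimeout(() => client.unsubscribe('sensors/a'), 60 * 1000)
```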

@tsightler
Author

I think my issue may be related to #1424. I'll try adding a callback to the publish call and see what happens.

@YoDaMa
Contributor

YoDaMa commented Mar 7, 2022

@tsightler thanks for filing this issue. Could you provide a minimal reproducible sample in an online IDE environment? This would be useful to integrate as a long-haul memory leak test anyway, even as we debug the current issue.

@tsightler
Author

@YoDaMa I'd be happy to try, but it may be beyond my limited abilities as I've never really used any such environment.

Regardless, I've verified that the issue goes away if I add a dummy callback function, so I'm going to close this as a duplicate of #1424. I'm confident this is the same issue; it just shows up as a leak here because I'm sending larger messages.
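
For reference, the workaround is simply passing a no-op callback to publish, roughly like this (simplified sketch with a placeholder broker URL and topic):

```js
// Workaround sketch (placeholder broker URL and topic): passing a dummy
// callback to publish() avoids the memory growth seen on 4.3.5/4.3.6.
const mqtt = require('mqtt')
const client = mqtt.connect('mqtt://localhost:1883')

client.on('connect', () => {
  const image = Buffer.alloc(20 * 1024, 0xff) // fake binary image data
  client.publish('camera/snapshot/image', image, () => {
    // no-op callback; its presence is what makes the difference here
  })
})
```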

@YoDaMa
Contributor

YoDaMa commented Mar 9, 2022

we will sort this out. thanks!

@BertKleewein
Contributor

@tsightler - sorry to cause you problems. This is fixed for the next release: #1443

@tsightler
Author

@BertKleewein It's no problem, thanks to everyone who works on this project for getting it squared away!
