
Memory leak #1060

Closed
reallovelei opened this issue Feb 28, 2018 · 3 comments

reallovelei commented Feb 28, 2018

Versions

Please specify real version numbers or git SHAs, not just "Latest" since that changes fairly regularly.
Sarama Version: 1.14.0
Kafka Version: 0.11.0.1 (Scala 2.11 build)
Go Version: go version go1.9.2 linux/amd64

Configuration

What configuration values are you using for Sarama and Kafka?

Logs

When filing an issue please provide logs from Sarama and Kafka if at all
possible. You can set sarama.Logger to a log.Logger to capture Sarama debug
output.
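For reference, enabling Sarama's debug output is a one-line assignment to the package-level `sarama.Logger` variable. A minimal sketch (this uses the plain `github.com/Shopify/sarama` import path rather than the reporter's vendored `hardess/vendor/...` path):

```go
package main

import (
	"log"
	"os"

	"github.com/Shopify/sarama"
)

func main() {
	// sarama.Logger is a package-level StdLogger; assigning a
	// *log.Logger here captures Sarama's debug output on stderr.
	sarama.Logger = log.New(os.Stderr, "[sarama] ", log.LstdFlags)
}
```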

Problem Description

heap profile: 0: 0 [38: 2051184] @ heap/1048576
0: 0 [0: 0] @ 0x850310 0x83e044 0x83cef0 0x84d2bf 0x84babd 0x871a3a 0x86d763 0x45e681
0x85030f hardess/vendor/github.com/Shopify/sarama.versionedDecode+0x4f /usr/local/src/go/src/hardess/vendor/github.com/Shopify/sarama/encoder_decoder.go:78
0x83e043 hardess/vendor/github.com/Shopify/sarama.(*Broker).sendAndReceive+0x243 /usr/local/src/go/src/hardess/vendor/github.com/Shopify/sarama/broker.go:428
0x83ceef hardess/vendor/github.com/Shopify/sarama.(*Broker).Fetch+0x6f /usr/local/src/go/src/hardess/vendor/github.com/Shopify/sarama/broker.go:259
0x84d2be hardess/vendor/github.com/Shopify/sarama.(*brokerConsumer).fetchNewMessages+0x4ae /usr/local/src/go/src/hardess/vendor/github.com/Shopify/sarama/consumer.go:811
0x84babc hardess/vendor/github.com/Shopify/sarama.(*brokerConsumer).subscriptionConsumer+0x13c /usr/local/src/go/src/hardess/vendor/github.com/Shopify/sarama/consumer.go:697
0x871a39 hardess/vendor/github.com/Shopify/sarama.(*brokerConsumer).(hardess/vendor/github.com/Shopify/sarama.subscriptionConsumer)-fm+0x29 /usr/local/src/go/src/hardess/vendor/github.com/Shopify/sarama/consumer.go:638
0x86d762 hardess/vendor/github.com/Shopify/sarama.withRecover+0x42 /usr/local/src/go/src/hardess/vendor/github.com/Shopify/sarama/utils.go:43

eapache (Contributor) commented Feb 28, 2018

This seems a lot like #1046?

The "problem description" is just a stack trace; is that where you think the leak is occurring? Do you have a more complete profile you can share showing it actually building over time? That location will definitely allocate a lot of memory, so it may not be an actual leak.

As a general rule, Go has garbage collection so it's very difficult to leak memory.

eapache (Contributor) commented Feb 28, 2018

I've answered similar tickets a lot, so I put a more detailed answer in the FAQ now: https://github.com/Shopify/sarama/wiki/Frequently-Asked-Questions#why-is-the-consumer-leaking-memory-andor-goroutines

chandradeepak (Contributor) commented
@reallovelei
I have faced a similar issue, and this is how we solved it; see if it makes any difference for you.

By default, Sarama buffers 256 messages per partition in the partition consumer. So in your case, if you have 4 partitions and each message is close to 1 MB, about 1 GB is allocated in memory, and with multiple consumers running it would be far higher. Reduce Config.ChannelBufferSize (in the Sarama config) to something smaller, like 10; that way each partition buffers only about 10 MB, and 4 partitions about 40 MB. Below is the property I am talking about:
https://github.com/Shopify/sarama/blob/master/config.go#L262

@eapache eapache closed this as completed Apr 14, 2018