
Ingester not consuming messages with kafka server 2.x #1247

Closed
marqc opened this issue Dec 13, 2018 · 5 comments

Comments

marqc (Contributor) commented Dec 13, 2018

Requirement - what kind of business use case are you trying to solve?

Use a recent Kafka server with jaeger-collector and jaeger-ingester.
Specifically, Kafka 2.0.1 with log.message.format.version=2.0-IV1.

Problem - what in Jaeger blocks you from solving the requirement?

Jaeger depends on Shopify/Sarama 1.16, which does not support recent Kafka server versions and message formats. With the above Kafka server config it does not consume messages, and no errors or warnings are printed to the log.

Proposal - what do you suggest to solve the problem or improve the existing situation?

Update the Shopify/Sarama dependency to the latest stable version (1.20): https://github.com/Shopify/sarama/releases

Any open questions to address

@Mikedu1988

I owe you big time! After downgrading the Kafka version, messages finally got processed.

@ledor473
Member

@jaegertracing/jaeger-maintainers sounds like this is closed by #1248

@ivan-klass

Still reproducible in 1.13.1.
With debug level enabled, execution never reaches this line:
https://github.com/jaegertracing/jaeger/blob/master/cmd/ingester/app/consumer/consumer.go#L141; there are no logs after {"level":"info","ts":1563794224.2705138,"caller":"consumer/consumer.go:78","msg":"Starting main loop"}
The app is then killed after the "deadlockInterval" (set to 5 min), but I'm 100% sure there are messages in the topic during this interval: they are visible in other consuming tools, and I also have a binary that generates such messages.

I'm using docker-compose & fast-data-dev:

fast-data-dev:
  image: landoop/fast-data-dev:2.2
  ports:
    - 2181:2181
    - 3030:3030
    - 8081-8083:8081-8083
    - 9581-9585:9581-9585
    - 9092:9092
  environment:
    - ADV_HOST=127.0.0.1
    - SAMPLEDATA=0
    - RUNTESTS=0
    - FORWARDLOGS=0
    - DISABLE_JMX=1

jaeger-ingester:
  image: jaegertracing/jaeger-ingester:1.13.1
  command: [
    "--log-level=debug",
    "--cassandra.keyspace=jaeger_v1_dc1",
    "--cassandra.servers=cassandra",
    "--kafka.consumer.brokers=fast-data-dev:9092",
    "--kafka.consumer.encoding=json",
    "--kafka.consumer.topic=my.tracing.zipkin.data"] # plain json payload
  environment:
    - SPAN_STORAGE_TYPE=cassandra
  ports:
    - 14270:14270
    - 14271:14271
  restart: on-failure
  depends_on:
    - fast-data-dev
    - cassandra
    - cassandra-schema

marqc (Contributor, Author) commented Jul 22, 2019

@klass-ivan You might be interested in #1640

The ingester uses the deprecated sarama-cluster package, which pins the Kafka version to 0.9; this patch allows overriding that setting from the app configuration. It is merged to master, but not released yet.
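
Once a release includes #1640, the protocol version should be settable on the ingester itself. A sketch against the compose file above; the flag name and version format are taken from the #1640 discussion and may differ in the final release, so treat them as assumptions:

jaeger-ingester:
  image: jaegertracing/jaeger-ingester:latest
  command: [
    "--kafka.consumer.brokers=fast-data-dev:9092",
    "--kafka.consumer.topic=my.tracing.zipkin.data",
    "--kafka.consumer.protocol-version=2.0.1"]  # match the broker's log.message.format.version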

@ivan-klass

@marqc nevermind, the problem was that the Kafka advertised host was incorrect for the in-docker network. Thanks
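
For anyone hitting the same symptom with the compose file above: the broker must advertise a host that the consumer can resolve from inside the docker network. A minimal sketch, assuming the default compose network where services resolve each other by service name:

fast-data-dev:
  image: landoop/fast-data-dev:2.2
  environment:
    - ADV_HOST=fast-data-dev  # advertise the compose service name instead of 127.0.0.1

With ADV_HOST=127.0.0.1, the broker tells clients to reconnect to localhost, which inside the jaeger-ingester container points at the ingester itself, so the consumer connects to the bootstrap broker but then silently receives nothing.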


4 participants