Subscribing to non-existent topic silently fails #1201
Comments
Hi,
This is still an issue.
Thanks for the quick reply.
Do you know if this bug will be resolved soon?
Will the consumer get messages once the topic is created?
Yes, the new topic will be picked up within topic.metadata.refresh.interval.ms; the consumer will then update its subscription, rejoin the group, and start consuming.
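As a side note, the time until the new topic is noticed is governed by topic.metadata.refresh.interval.ms (default 300000 ms). A minimal C sketch of lowering it on a consumer, with a placeholder broker address, group name, and interval (not taken from this thread):

```c
#include <librdkafka/rdkafka.h>

/* Sketch: build a consumer whose metadata refresh interval is lowered from
 * the 300000 ms default so a topic created after subscribing is picked up
 * sooner. All values are illustrative placeholders. */
static rd_kafka_t *make_consumer(void) {
        char errstr[512];
        rd_kafka_conf_t *conf = rd_kafka_conf_new();

        rd_kafka_conf_set(conf, "bootstrap.servers", "localhost:9092",
                          errstr, sizeof(errstr));
        rd_kafka_conf_set(conf, "group.id", "example-group",
                          errstr, sizeof(errstr));
        rd_kafka_conf_set(conf, "topic.metadata.refresh.interval.ms",
                          "10000", errstr, sizeof(errstr));

        return rd_kafka_new(RD_KAFKA_CONSUMER, conf, errstr, sizeof(errstr));
}
```

A shorter interval makes the subscription get re-evaluated sooner, at the cost of more frequent metadata requests.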
Tracked in #1540
Description
See confluentinc/confluent-kafka-dotnet#166
With automatic topic creation disabled on the broker, calling Consumer.Subscribe() on a non-existent topic, or on a list of topics where one does not exist, neither returns an error nor produces a partition assignment. No error callbacks or partition assignment callbacks are invoked.
From the logs it looks like the consumer waits until metadata is available for all subscribed topics before joining the group and consuming any of them.
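For illustration, here is a rough librdkafka-level sketch of the call described above; the handle `rk` and the topic names are assumptions, not taken from the report:

```c
#include <stdio.h>
#include <librdkafka/rdkafka.h>

/* Sketch: subscribe an existing consumer handle `rk` to two topics that
 * exist plus one that does not. With auto topic creation disabled on the
 * broker, subscribe() still reports success and no assignment follows. */
static void subscribe_with_missing_topic(rd_kafka_t *rk) {
        rd_kafka_topic_partition_list_t *topics =
                rd_kafka_topic_partition_list_new(3);

        rd_kafka_topic_partition_list_add(topics, "test",
                                          RD_KAFKA_PARTITION_UA);
        rd_kafka_topic_partition_list_add(topics, "test2",
                                          RD_KAFKA_PARTITION_UA);
        rd_kafka_topic_partition_list_add(topics, "does-not-exist",
                                          RD_KAFKA_PARTITION_UA);

        rd_kafka_resp_err_t err = rd_kafka_subscribe(rk, topics);
        printf("subscribe: %s\n", rd_kafka_err2str(err)); /* "Success" */

        rd_kafka_topic_partition_list_destroy(topics);
        /* Subsequent rd_kafka_consumer_poll() calls return no messages and
         * no rebalance events for this group. */
}
```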
How to reproduce
Reproduced using the advanced consumer example in the .NET client, subscribing to three topics where one of them doesn't exist.
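For comparison, this is roughly the rebalance callback one would expect to fire once partitions are assigned; in this scenario it is never invoked. A sketch only, using standard librdkafka callback wiring rather than code copied from the .NET example:

```c
#include <librdkafka/rdkafka.h>

/* Sketch: a typical rebalance callback. With a non-existent topic in the
 * subscription the group join is postponed, so this is never called. */
static void rebalance_cb(rd_kafka_t *rk, rd_kafka_resp_err_t err,
                         rd_kafka_topic_partition_list_t *partitions,
                         void *opaque) {
        if (err == RD_KAFKA_RESP_ERR__ASSIGN_PARTITIONS)
                rd_kafka_assign(rk, partitions);
        else
                rd_kafka_assign(rk, NULL);
}

/* Registered on the configuration object before creating the consumer:
 *     rd_kafka_conf_set_rebalance_cb(conf, rebalance_cb);
 */
```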
Checklist
Please provide the following information:
librdkafka version (release number or git tag): 0.9.5
Apache Kafka version: 10.0.1
librdkafka client configuration:
group.id=advanced-csharp-consumer
enable.auto.commit=false
auto.commit.interval.ms=5000
statistics.interval.ms=60000
bootstrap.servers=X
debug=cgrp
Operating system: Windows client
Using the legacy Consumer
Using the high-level KafkaConsumer
Provide logs (with debug=.. as necessary) from librdkafka (see below)
Provide broker log excerpts
Critical issue
7|2017-05-04 12:03:49.366|rdkafka#consumer-1|CGRPQUERY| [thrd:main]: kafka-bv-dev01-01.xxxx.com:9092/1: Group "advanced-csharp-consumer": querying for coordinator: intervaled in state up
7|2017-05-04 12:03:49.366|rdkafka#consumer-1|CGRPCOORD| [thrd:main]: kafka-bv-dev01-02.xxxx.com:9092/2: Group "advanced-csharp-consumer" coordinator is kafka-bv-dev01-02.xxxx.com:9092 id 2
7|2017-05-04 12:03:50.315|rdkafka#consumer-1|SUBSCRIPTION| [thrd:main]: Group "advanced-csharp-consumer": effective subscription list changed from 0 to 2 topic(s):
7|2017-05-04 12:03:50.315|rdkafka#consumer-1|SUBSCRIPTION| [thrd:main]: Topic test with 1 partition(s)
7|2017-05-04 12:03:50.316|rdkafka#consumer-1|SUBSCRIPTION| [thrd:main]: Topic test2 with 1 partition(s)
7|2017-05-04 12:03:50.316|rdkafka#consumer-1|JOIN| [thrd:main]: Group "advanced-csharp-consumer": join with 2 (3) subscribed topic(s)
7|2017-05-04 12:03:50.316|rdkafka#consumer-1|CGRPMETADATA| [thrd:main]: consumer join: metadata for subscription only available for 2/3 topics (0ms old)
7|2017-05-04 12:03:50.316|rdkafka#consumer-1|JOIN| [thrd:main]: Group "advanced-csharp-consumer": postponing join until up-to-date metadata is available
7|2017-05-04 12:03:50.316|rdkafka#consumer-1|CGRPCOORD| [thrd:main]: kafka-bv-dev01-01.xxxx.com:9092/1: Group "advanced-csharp-consumer" coordinator is kafka-bv-dev01-02.xxxx.com:9092 id 2
7|2017-05-04 12:03:50.802|rdkafka#consumer-1|JOIN| [thrd:main]: Group "advanced-csharp-consumer": join with 2 (3) subscribed topic(s)
7|2017-05-04 12:03:50.802|rdkafka#consumer-1|CGRPMETADATA| [thrd:main]: consumer join: metadata for subscription only available for 2/3 topics (500ms old)
7|2017-05-04 12:03:50.803|rdkafka#consumer-1|JOIN| [thrd:main]: Group "advanced-csharp-consumer": postponing join until up-to-date metadata is available