Proposal: support subscribe to messages as Kafka consumer #6874
Comments
Any real use cases for this?
Maybe Publish & Subscribe, like Dapr, which shields developers from message broker differences. But is that a suitable scenario for APISIX?
It would be much better to have an architecture diagram to explain this new architecture. In this model, what is the relationship between APISIX and Kafka's consumer groups?
For now we have no way to support consumer group capabilities (i.e. offsets need to be managed by the client itself).
Done
Background
Apache Kafka is an open-source stream-processing platform developed by the Apache Software Foundation and written in Scala and Java. It is a high-throughput, distributed publish-subscribe messaging system that can handle all the action-stream data of consumers on a website.
Previously, APISIX supported Kafka only as a producer, used by its logger plugins. We therefore want to add this feature to APISIX so that end users can pull messages from Kafka through a protocol such as WebSocket, allowing APISIX to support publish/subscribe scenarios.
Scheme
Overview
We thought it would be a good idea to treat Kafka as an upstream type, reuse the upstream object as the data source, and use a route plugin to configure authentication for Kafka broker.
First, we'll create a plugin that provides a simple function: writing the plugin configuration into an APISIX variable (ctx).
Next, we'll add kafka as an upstream scheme to the upstream schema definition, add a validation mode that supports filling in only nodes and scheme, and refine the schema validation code accordingly.
Finally, the Kafka consumer logic will be entered from the http access phase entry code, based on the upstream scheme field.
Kafka service
This will open a WebSocket service on the configured route uri and interact with the client via a protobuf-based protocol. At this stage, we do not support the Kafka consumer group capability, so we will implement a simple version in which the client alone manages the consumer offset, passing it in actively when pulling messages.
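To make the client-managed offset model concrete, here is a minimal sketch (in Python for brevity; the names `PullRequest`, `PullResponse`, and `fake_fetch` are illustrative assumptions, not the actual protocol, and the WebSocket round trip is stubbed out with an in-memory log):

```python
from dataclasses import dataclass, field

@dataclass
class PullRequest:
    topic: str
    partition: int
    offset: int  # client supplies the offset explicitly on every pull

@dataclass
class PullResponse:
    messages: list = field(default_factory=list)
    next_offset: int = 0

def fake_fetch(req: PullRequest, log: list) -> PullResponse:
    # Stand-in for the WebSocket round trip: return up to 2 messages
    # from `log` starting at the client-requested offset.
    msgs = log[req.offset:req.offset + 2]
    return PullResponse(messages=msgs, next_offset=req.offset + len(msgs))

# Client loop: with no consumer group, the client itself tracks the offset
# and passes it in on each request.
log = ["m0", "m1", "m2", "m3", "m4"]
offset, received = 0, []
while offset < len(log):
    resp = fake_fetch(PullRequest("test-topic", 0, offset), log)
    received.extend(resp.messages)
    offset = resp.next_offset
```

The key point of the design is the last loop: the server is stateless with respect to consumption progress, so a client that crashes simply resumes from whatever offset it last persisted.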
Protocol
Currently, we expect to use protobuf as the encoding for the communication protocol, carrying commands and their response results.
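The exact message set is not final at this stage; a minimal illustrative .proto (all message and field names here are assumptions, not the final schema) might look like:

```protobuf
syntax = "proto3";

// Command sent by the client over the WebSocket connection.
message PullRequest {
  string topic = 1;
  int32 partition = 2;
  int64 offset = 3;  // client-managed, passed in actively on each pull
}

// A single Kafka record returned to the client.
message Message {
  int64 offset = 1;
  bytes key = 2;
  bytes value = 3;
}

// Response carrying the fetched records.
message PullResponse {
  repeated Message messages = 1;
}
```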
Configure
Core
We need to extend the upstream configuration to also support the case where only scheme and nodes are filled in.
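As a sketch, a route using such an upstream might look like the following (field names other than scheme and nodes follow APISIX's existing route/upstream schema; the kafka scheme value is what this proposal adds, and the broker address is a placeholder):

```json
{
    "uri": "/kafka",
    "upstream": {
        "nodes": {
            "kafka-broker1:9092": 1
        },
        "type": "none",
        "scheme": "kafka"
    }
}
```

Note that no health check, retry, or pass_host fields are needed here, which is why a relaxed validation mode accepting only nodes and scheme is required.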
Plugin
Implementation
http_access_phase processing code: initialize the consumer when the scheme is kafka.
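The branch described above can be sketched conceptually as follows (in Python for brevity; APISIX itself is written in Lua, and the function and return values here are illustrative assumptions):

```python
def access_phase(upstream_scheme: str) -> str:
    # Dispatch based on the upstream scheme field: routes whose upstream
    # scheme is "kafka" are handed to the Kafka consumer entry point
    # instead of going through the normal HTTP reverse-proxy path.
    if upstream_scheme == "kafka":
        return "kafka-consumer"
    return "http-proxy"
```

This keeps the consumer feature isolated behind a single dispatch point, so existing HTTP routes are unaffected.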