Apache Kafka is an open source stream-processing platform: a high-throughput, distributed publish-subscribe messaging system that can handle all of the user activity stream data of a website. All of its functions are distributed, highly scalable, elastic, fault-tolerant, and secure.
Apache APISIX 2.14 introduces a new feature: Kafka-type upstreams. By configuring the scheme field of a route's upstream, users can enable Kafka consumer support and subscribe to messages in a variety of environments.
This article introduces the Kafka publish-subscribe feature and the usage details of the kafka-proxy plugin, and shows how to combine APISIX with Kafka to consume messages in connection-limited environments such as browsers.
Principle
Kafka uses a custom TCP-based protocol for communication between brokers and consumers. In Apache APISIX this traffic can be proxied with the Layer 4 (stream) proxy, but that approach cannot adequately support end users, such as browsers, that need to talk to a broker yet cannot open a raw TCP connection.
Now the client can instead connect to APISIX over WebSocket; APISIX establishes the connection to Kafka internally and handles the client's commands (such as fetching offsets or fetching messages). The WebSocket connection makes it possible to pull messages from Kafka even in scenarios where the browser cannot open a TCP connection directly.
As the flow above shows, a custom Protobuf-based protocol is used for communication, and it can be compiled into clients for multiple programming languages with little effort. In this flow, offsets are obtained primarily through the ListOffsets command, and messages are retrieved through the Fetch command.
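To make this flow more concrete, the following TypeScript interfaces sketch the shape of the data exchanged over the WebSocket connection. These names and fields are simplified assumptions for illustration only; the authoritative Protobuf definitions live in the pubsub.proto file in the APISIX repository.

// Illustrative shapes for the PubSub protocol (assumption: field names are
// simplified; see pubsub.proto in the APISIX repository for the real definitions).

// Ask for the offset of a topic partition (e.g. the earliest or latest one).
interface CmdKafkaListOffset {
  topic: string;
  partition: number;
  timestamp: number;
}

// Fetch messages from a topic partition starting at a given offset.
interface CmdKafkaFetch {
  topic: string;
  partition: number;
  offset: number;
}

// Each request carries a sequence number so responses can be correlated.
interface PubSubReq {
  sequence: number;
  cmdKafkaListOffset?: CmdKafkaListOffset;
  cmdKafkaFetch?: CmdKafkaFetch;
}

// The response echoes the sequence number and carries either data or an error.
interface PubSubResp {
  sequence: number;
  errorResp?: { code: number; message: string };
  kafkaListOffsetResp?: { offset: number };
  kafkaFetchResp?: { messages: { offset: number; key?: string; value: string }[] };
}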
In addition, since the Kafka consumer functionality is built on the APISIX PubSub framework, you can also extend the framework to other messaging systems to achieve richer publish-subscribe capabilities.
How to use
Set up Kafka routing
APISIX adds a new upstream scheme type, so upstreams now support Kafka in addition to protocols such as HTTP and gRPC. To enable Kafka publish-subscribe support, simply set the scheme field to kafka and the nodes field to the address and port of the Kafka broker.
curl -X PUT 'http://127.0.0.1:9080/apisix/admin/routes/kafka' \
    -H 'X-API-KEY: ${api-key}' \
    -H 'Content-Type: application/json' \
    -d '{
        "uri": "/kafka",
        "upstream": {
            "nodes": {
                "kafka-server:9092": 1
            },
            "type": "none",
            "scheme": "kafka"
        }
    }'
(Optional) Set up the TLS handshake
When the tls field is present on the Kafka upstream, APISIX enables a TLS handshake for the connection. The tls object also provides a verify field, which controls whether the server certificate is verified during the TLS handshake.
curl -X PUT 'http://127.0.0.1:9080/apisix/admin/routes/kafka' \
    -H 'X-API-KEY: ${api-key}' \
    -H 'Content-Type: application/json' \
    -d '{
        "uri": "/kafka",
        "upstream": {
            "nodes": {
                "kafka-server:9092": 1
            },
            "type": "none",
            "scheme": "kafka",
            "tls": {
                "verify": true
            }
        }
    }'
(Optional) Set up SASL/PLAIN authentication
To support authentication, APISIX also provides the kafka-proxy plugin, which lets users configure SASL authentication on a Kafka route. Currently only the PLAIN mechanism is supported.
curl -X PUT 'http://127.0.0.1:9080/apisix/admin/routes/kafka' \
    -H 'X-API-KEY: ${api-key}' \
    -H 'Content-Type: application/json' \
    -d '{
        "uri": "/kafka",
        "plugins": {
            "kafka-proxy": {
                "sasl": {
                    "username": "user",
                    "password": "pwd"
                }
            }
        },
        "upstream": {
            "nodes": {
                "kafka-server:9092": 1
            },
            "type": "none",
            "scheme": "kafka"
        }
    }'
Set up the client and test
You can obtain the PubSub protocol definition, which contains the Kafka commands and responses, from the Apache APISIX GitHub repository and compile it into an SDK for the language you need.
After that, a client such as a browser can connect over WebSocket to the Kafka route URI configured earlier, ws://127.0.0.1:9080/kafka, and send PubSubReq data containing the commands it wants to execute. APISIX will fetch the data from Kafka and return PubSubResp response data to the client.
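As a minimal sketch, the browser-side TypeScript below shows one way to issue a Fetch command against this route. It assumes the pubsub.proto file is served next to the page, uses the protobufjs library for encoding and decoding, and uses a hypothetical topic named test; the exact message and field names should be checked against the protocol file you downloaded.

// Minimal browser-side sketch (assumptions: "pubsub.proto" is reachable by the
// page and exposes PubSubReq/PubSubResp with the field names used below --
// verify them against the protocol definition you compiled).
import protobuf from "protobufjs";

async function consumeOnce(): Promise<void> {
  // Load the protocol definition and look up the request/response types.
  const root = await protobuf.load("pubsub.proto");
  const PubSubReq = root.lookupType("PubSubReq");
  const PubSubResp = root.lookupType("PubSubResp");

  // Connect to the Kafka route configured earlier.
  const ws = new WebSocket("ws://127.0.0.1:9080/kafka");
  ws.binaryType = "arraybuffer";

  ws.onopen = () => {
    // Fetch messages from partition 0 of the (hypothetical) "test" topic,
    // starting at offset 0.
    const payload = {
      sequence: 1,
      cmdKafkaFetch: { topic: "test", partition: 0, offset: 0 },
    };
    const buffer = PubSubReq.encode(PubSubReq.create(payload)).finish();
    ws.send(buffer);
  };

  ws.onmessage = (event: MessageEvent<ArrayBuffer>) => {
    // Decode the PubSubResp returned by APISIX and log it.
    const resp = PubSubResp.decode(new Uint8Array(event.data));
    console.log("response from APISIX:", PubSubResp.toObject(resp));
    ws.close();
  };
}

consumeOnce().catch(console.error);

In practice you would typically send a ListOffsets command first to discover the starting offset, then issue Fetch commands from there, as described in the flow above.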
When you are done, simply delete the route to disable the Kafka functionality of that endpoint. Thanks to the dynamic nature of APISIX, the change takes effect without a restart.
Summary
For a more detailed description and the complete configuration options of the Kafka publish-subscribe feature, refer to the following links:
- Kafka publish-subscribe: https://apisix.apache.org/zh/docs/apisix/next/pubsub/kafka
- kafka-proxy plugin: https://apisix.apache.org/en/docs/apisix/next/plugins/kafka-proxy
Apache APISIX is also working on publish-subscribe support for other messaging systems. If you are interested, you are welcome to read the development documentation for the publish-subscribe (PubSub) module: https://apisix.apache.org/zh/docs/apisix/next/pubsub.
If you have any ideas, you are also welcome to start a discussion in GitHub Discussions or to communicate through the community mailing list.