Apache Kafka Producer Example Using CLI
Requirements
Before we dive in, ensure that you have an Apache Kafka cluster running on your machine. You can quickly spin up a Kafka container with Docker Compose using the following docker-compose.yml file:
services:
  zookeeper:
    image: bitnami/zookeeper:3.8
    ports:
      - "2181:2181"
    volumes:
      - "zookeeper_data:/bitnami"
    environment:
      - ALLOW_ANONYMOUS_LOGIN=yes
  kafka:
    image: docker.io/bitnami/kafka:3.3
    ports:
      - "9092:9092"
    volumes:
      - "kafka_data:/bitnami"
    environment:
      - KAFKA_CFG_ZOOKEEPER_CONNECT=zookeeper:2181
      - ALLOW_PLAINTEXT_LISTENER=yes
    depends_on:
      - zookeeper
volumes:
  zookeeper_data:
    driver: local
  kafka_data:
    driver: local
Run the following command from the directory containing the Compose file to start the containers:
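The command itself was not included here; with the Compose file above, starting the stack in detached mode is typically:

```shell
docker-compose up -d
```

On newer Docker installations the equivalent plugin form is `docker compose up -d`.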
This should start the container and map the zookeeper and Kafka services to ports 2181 and 9092, respectively.
Produce Messages from CLI
To produce a message to a Kafka topic, you must ensure that the topic exists and then use the kafka-console-producer.sh utility.
Start by creating an example topic for the cluster with the “kafka-topics.sh” command. An example is shown in the following:
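The exact command is missing here; a typical invocation matching the description below (one partition, one replica, with the broker assumed to be reachable at localhost:9092) would be:

```shell
kafka-topics.sh --create \
  --topic sample_topic \
  --partitions 1 \
  --replication-factor 1 \
  --bootstrap-server localhost:9092
```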
The previous command creates a Kafka topic called “sample_topic” with one partition and one replica.
To start producing messages to the topic that we previously created, we can use the kafka-console-producer.sh utility. The command is as shown in the following:
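The original command is not shown; assuming the broker at localhost:9092 and the sample_topic topic created earlier, it would typically be:

```shell
kafka-console-producer.sh \
  --topic sample_topic \
  --bootstrap-server localhost:9092
```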
Once you run the previous command, Kafka opens an input buffer where you can produce messages to the Kafka topic. Each message takes up one line.
>First message
>Second message
>Third message
>...
>^C
To end the message input, terminate the session with Ctrl+C.
Keep in mind that the messages are sent with a null key by default.
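If you want keyed messages instead, the console producer supports this through properties; a sketch of such an invocation (using ":" as an illustrative key separator) is:

```shell
kafka-console-producer.sh \
  --topic sample_topic \
  --bootstrap-server localhost:9092 \
  --property parse.key=true \
  --property key.separator=:
```

With these properties set, an input line such as `user1:hello` is sent with the key `user1` and the value `hello`.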
Conclusion
We explored the basics of working with the Kafka producer CLI tool to write messages to an existing Kafka topic.
Source: linuxhint.com