
Apache Kafka Producer Example in Python

Python has quickly become one of the most influential and easy-to-learn languages of the modern age. Thanks to swift and constant development, it is always evolving with new features and performance enhancements.

In this tutorial, we will focus on learning how to create a Kafka producer in Python using the Confluent Kafka Python client library. This library provides a high-level interface with a robust and easy-to-use API for interacting with Apache Kafka.

It is built on top of the librdkafka C library, a highly performant and reliable library for Apache Kafka clients.
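Because the client is a thin wrapper around librdkafka, its configuration is simply a dictionary of standard librdkafka property names that are passed straight through to the C library. As a small illustration (the broker address and property values here are only examples, not part of the tutorial code):

from confluent_kafka import Producer

# Every key below is a standard librdkafka producer setting
conf = {
    'bootstrap.servers': 'localhost:9092',  # broker(s) to connect to
    'client.id': 'kafka-py-demo',           # identifies this client in broker logs
    'acks': 'all',                          # wait for all in-sync replicas to acknowledge
    'linger.ms': 5                          # batch messages for up to 5 ms before sending
}

producer = Producer(conf)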

Throughout this post, you will learn the basics of connecting to a Kafka broker that runs on your local machine, creating a new producer, and writing messages to an existing topic.

Requirements:

To follow along with this tutorial, you need a basic understanding of Apache Kafka, producers, consumers, and Kafka topics. You also need a basic understanding of Python.

We also assume that you have the latest versions of Python and Apache Kafka set up on your machine.

Project Setup

Since this is a basic tutorial, all we need is a single Python file to store the source code for the producer. We can create a directory to store the file.

$ mkdir ~/projects/kafka_py

Change into the project’s directory and create the source file.

$ cd ~/projects/kafka_py && touch producer.py

Edit the file to store the source code with your text editor of choice.

$ vim producer.py

Before proceeding further, install the confluent-kafka package using pip as shown in the following command:

$ pip install confluent-kafka
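Once the package is installed, you can optionally confirm that it is importable and check which client and librdkafka versions you have. The following is a quick sketch using two helper functions exposed by the confluent_kafka module:

import confluent_kafka

# Print the (version string, version int) tuples for the Python client
# and the bundled librdkafka C library
print("confluent-kafka-python:", confluent_kafka.version())
print("librdkafka:", confluent_kafka.libversion())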

Once installed, edit the source file and add the source code as shown in the following:

from confluent_kafka import Producer

def delivery_report(err, msg):
    # Called once per message to report whether the write succeeded or failed
    if err is not None:
        print('Event delivery failed: {}'.format(err))
    else:
        print('Event produced to {} [{}]'.format(msg.topic(), msg.partition()))

# kafka configuration
conf = {
    'bootstrap.servers': 'localhost:9092'
}

# kafka producer init
producer = Producer(conf)

# Read data from the user_data.txt file
with open("user_data.txt", "r") as file:
    for line in file:
        # write messages to the users topic
        producer.produce("users", key=None, value=line.encode('utf-8'), callback=delivery_report)

# Wait for any outstanding messages to be delivered and delivery report callbacks to be triggered
producer.flush()

The previous code reads the “user_data.txt” file and writes each line of the file as a message to the “users” topic in Kafka.
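This assumes that the “users” topic already exists or that your broker auto-creates topics on first write. If neither is the case, you can create the topic up front with the Admin API that ships with the same package. The following is a minimal sketch, assuming a single local broker on localhost:9092:

from confluent_kafka.admin import AdminClient, NewTopic

# Connect the admin client to the local broker
admin = AdminClient({'bootstrap.servers': 'localhost:9092'})

# Request a "users" topic with one partition and a replication factor of 1
futures = admin.create_topics([NewTopic("users", num_partitions=1, replication_factor=1)])

# create_topics() is asynchronous; wait for each request to complete
for topic, future in futures.items():
    try:
        future.result()
        print("Topic '{}' created".format(topic))
    except Exception as exc:
        print("Failed to create topic '{}': {}".format(topic, exc))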

The sample user_data.txt data is as shown in the following:

Leticia,lelliman0@answers.com,228.112.209.31
Karna,keldershaw1@photobucket.com,61.229.47.131
Hedda,hdifranceschi2@de.vu,103.28.17.160
Bat,bsive3@issuu.com,69.87.104.4
Dame,dwilce4@meetup.com,19.12.21.58

Running the previous code reads the file and writes its contents, line by line, to the “users” topic.
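If you want to verify that the messages landed in the topic, you can read them back with a consumer from the same library. The following is a minimal sketch, assuming the same local broker and a hypothetical consumer group named “kafka_py_check”:

from confluent_kafka import Consumer

consumer = Consumer({
    'bootstrap.servers': 'localhost:9092',
    'group.id': 'kafka_py_check',      # hypothetical group id for this check
    'auto.offset.reset': 'earliest'    # start reading from the beginning of the topic
})
consumer.subscribe(['users'])

try:
    # Poll a few times and print whatever has been produced so far
    for _ in range(10):
        msg = consumer.poll(1.0)
        if msg is None:
            continue
        if msg.error():
            print('Consumer error: {}'.format(msg.error()))
            continue
        print(msg.value().decode('utf-8').strip())
finally:
    consumer.close()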

Conclusion

We hope you enjoyed this post, which covered the basics of reading a file and writing its contents to a Kafka topic using the Confluent Kafka Python client.
