Message and Event Processing with Kafka (1) - Kafka Setup
This post was written while reading the book Kafka: The Definitive Guide. For the hands-on setup and example commands, I followed the official documentation rather than the book.
Apache Kafka

Apache Kafka is a distributed data streaming platform that can publish, subscribe, store, and process record streams in real-time. It is designed to handle data streams from multiple sources and deliver them to multiple consumers. In simple terms, it can move large amounts of data simultaneously not just from point A to point B, but from point A to point B and everywhere else it’s needed.
Key Concepts to Understand When Using Kafka
- ZooKeeper
Apache ZooKeeper is an open-source project that provides distributed coordination services.
Instead of each application handling scheduling and task coordination itself, ZooKeeper takes care of the coordination on their behalf. For stability it is run as a cluster (an ensemble), typically with an odd number of nodes so that a majority quorum can always be formed.
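For reference, the ZooKeeper configuration that ships with Kafka (config/zookeeper.properties) is a minimal single-node setup along these lines (exact contents may differ slightly between Kafka versions, so check your own copy):

```
# Directory where ZooKeeper stores its snapshot data
dataDir=/tmp/zookeeper
# Port clients (the Kafka broker) connect to
clientPort=2181
# Unlimited, since this is a non-production single-node config
maxClientCnxns=0
```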
- Topic
Think of it as the ‘subject’ of the data. You create a Topic for a data subject or event, then send data to and read data from that Topic.
For example, you can create a ‘temperature’ Topic and send and read temperature-related data through it.
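As a concrete sketch of this idea, using the CLI tools covered later in this post (the topic name ‘temperature’ and the sample reading are just illustrations, and a broker is assumed to be running on localhost:9092):

```shell
# Create a Topic for temperature readings
bin/kafka-topics.sh --create --topic temperature --bootstrap-server localhost:9092

# A sensor-side process writes a reading to it...
echo "23.7" | bin/kafka-console-producer.sh --topic temperature --bootstrap-server localhost:9092

# ...and any interested reader consumes from it
bin/kafka-console-consumer.sh --topic temperature --from-beginning --bootstrap-server localhost:9092
```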
- Producer
The side that provides data; you could also call it “the talker.” It generates data, creates events, and sends them to the configured Topic.
- Consumer
The side that consumes data; in other words, the side that needs the data. It can also act as a bridge that analyzes or stores data and events as they occur.
Installing and Running Kafka
Since there are many services to run and verify, you’ll need to open quite a few terminal windows.
Download
https://www.apache.org/dyn/closer.cgi?path=/kafka/2.8.0/kafka_2.13-2.8.0.tgz
Download Kafka from the page above and move it to an appropriate location.
Unzip
$ tar -xzf kafka_2.13-2.8.0.tgz
$ cd kafka_2.13-2.8.0
Running Zookeeper and Broker for Kafka Execution
# Run from inside the kafka_2.13-2.8.0 folder
# ZooKeeper
$ bin/zookeeper-server-start.sh config/zookeeper.properties
# Broker service
$ bin/kafka-server-start.sh config/server.properties
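Both scripts run in the foreground and occupy their terminal. If you prefer fewer open windows, they also accept a -daemon flag to run in the background (log output then goes to the logs/ directory instead of the console):

```shell
# Start ZooKeeper and the broker in the background
bin/zookeeper-server-start.sh -daemon config/zookeeper.properties
bin/kafka-server-start.sh -daemon config/server.properties
```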
Creating a Topic for Message Delivery
# Create the myevent Topic
$ bin/kafka-topics.sh --create --topic myevent --bootstrap-server localhost:9092
# Describe the myevent Topic (partition count, leader, replicas, etc.)
$ bin/kafka-topics.sh --describe --topic myevent --bootstrap-server localhost:9092
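With no extra options the Topic is created with the broker defaults. The partition and replica counts can also be set explicitly at creation time; for example (the topic name myevent3 is just an illustration):

```shell
# Create a Topic with 3 partitions; the replication factor must be 1
# on a single-broker setup like this one
bin/kafka-topics.sh --create --topic myevent3 --partitions 3 --replication-factor 1 --bootstrap-server localhost:9092
```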
Sending Messages to a Topic (from command line interface)
$ bin/kafka-console-producer.sh --topic myevent --bootstrap-server localhost:9092
>
Once kafka-console-producer.sh is running, each line you type is sent to the Topic as a message, so the console window stays open waiting for input.
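For example, every line entered at the > prompt becomes one message (the text here is just sample input):

```shell
$ bin/kafka-console-producer.sh --topic myevent --bootstrap-server localhost:9092
>first event
>second event
```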
Reading Messages Sent to a Topic (from command line interface)
$ bin/kafka-console-consumer.sh --topic myevent --from-beginning --bootstrap-server localhost:9092
The messages typed into kafka-console-producer.sh above will be displayed.
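Two useful variations of the console consumer (both flags are part of the standard tool): leaving out --from-beginning reads only messages produced after the consumer starts, and --group lets several consumer instances share the Topic as a consumer group (the group name mygroup is just an illustration):

```shell
# Read only new messages from now on
bin/kafka-console-consumer.sh --topic myevent --bootstrap-server localhost:9092

# Join a consumer group; instances with the same group id split the partitions
bin/kafka-console-consumer.sh --topic myevent --group mygroup --bootstrap-server localhost:9092
```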