Getting Started with Kafka-Docker-Project

Apache Kafka is a distributed data store optimized for ingesting and processing streaming data in real-time.
The Kafka Producer API allows applications to send streams of data to the Kafka cluster.
The Kafka Consumer API allows applications to read streams of data from the cluster.

kafka-docker
============

Dockerfile for Apache Kafka. The image is available directly from Docker Hub.

Usage

Start a cluster:
docker-compose -f docker-compose-expose.yml up -d

Stop a cluster:
docker-compose stop

Connect Kafka Cluster

Connect to the Kafka cluster with Kafka Tool.
Download Kafka Tool: https://www.kafkatool.com/download.html

Configure Kafka Tool to connect to the Kafka cluster.

Install Library for Producer/Consumer Python App

pip install kafka-python

Run Producer App Example

An example producer app written in Python.
To start producing data, go to the /kafka-producer directory and run:

python producer.py (or py producer.py on Windows)


View Data

View the produced data with Kafka Tool.


Run Consumer App Example

An example consumer app written in Python.
To start consuming, go to the /kafka-consumer directory and run:

python consumer.py (or py consumer.py on Windows)


Tutorial & References

https://towardsdatascience.com/kafka-docker-python-408baf0e1088
http://wurstmeister.github.io/kafka-docker