Apache Kafka is a distributed data store optimized for ingesting and processing streaming data in real time.
The Kafka Producer API allows applications to send streams of data to the Kafka cluster.
The Kafka Consumer API allows applications to read streams of data from the cluster.
Dockerfile for Apache Kafka
============
The image is available directly from Docker Hub.
Start a cluster:
docker-compose -f docker-compose-expose.yml up -d
Stop the cluster:
docker-compose stop
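The compose file referenced above is not included here; the following is a minimal sketch of what a docker-compose-expose.yml for this setup typically contains, based on the wurstmeister/kafka-docker project linked at the end. Image names, ports, and environment values are assumptions and should be checked against the actual file:

```yaml
version: '2'
services:
  zookeeper:
    image: wurstmeister/zookeeper
    ports:
      - "2181:2181"
  kafka:
    image: wurstmeister/kafka
    ports:
      - "9092:9092"     # assumed exposed broker port used by the Python apps below
    environment:
      KAFKA_ADVERTISED_HOST_NAME: localhost
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
```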
Connect to the Kafka cluster with Kafka Tool.
Download Kafka Tool:
https://www.kafkatool.com/download.html
Configure Kafka Tool to point at the running cluster.
Install the Python client library:
pip install kafka-python
Example producer app written in Python.
To start producing data, go to the /kafka-producer directory and run:
python producer.py (or py producer.py on Windows)
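The repository's producer.py is not shown here; the following is a minimal sketch of what such a producer might look like with kafka-python. The topic name example-topic, the broker address localhost:9092, and the message payloads are all assumptions for illustration:

```python
import json


def serialize(value):
    """Encode a Python object as UTF-8 JSON bytes for the message value."""
    return json.dumps(value).encode("utf-8")


def produce(topic="example-topic", bootstrap="localhost:9092", count=5):
    """Send `count` JSON messages to `topic` (topic and port are assumptions)."""
    from kafka import KafkaProducer  # pip install kafka-python

    producer = KafkaProducer(
        bootstrap_servers=[bootstrap],
        value_serializer=serialize,
    )
    for i in range(count):
        producer.send(topic, {"number": i})
    producer.flush()  # block until all buffered messages are sent
    producer.close()


# With the compose cluster running:
# produce()
```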
View the produced data with Kafka Tool.
Example consumer app written in Python.
To start consuming, go to the /kafka-consumer directory and run:
python consumer.py (or py consumer.py on Windows)
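Likewise, the repository's consumer.py is not shown; this is a minimal sketch of a matching kafka-python consumer. The topic name and broker address are the same assumptions as in the producer sketch:

```python
import json


def deserialize(raw):
    """Decode UTF-8 JSON bytes back into a Python object."""
    return json.loads(raw.decode("utf-8"))


def consume(topic="example-topic", bootstrap="localhost:9092"):
    """Print every message on `topic` (topic and port are assumptions)."""
    from kafka import KafkaConsumer  # pip install kafka-python

    consumer = KafkaConsumer(
        topic,
        bootstrap_servers=[bootstrap],
        auto_offset_reset="earliest",  # start from the oldest available offset
        value_deserializer=deserialize,
    )
    for message in consumer:
        print(message.topic, message.partition, message.offset, message.value)


# With the cluster running and the producer started:
# consume()
```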
References:
https://towardsdatascience.com/kafka-docker-python-408baf0e1088
http://wurstmeister.github.io/kafka-docker