This project demonstrates the use of Apache Kafka by integrating a simple Python-based frontend application with a Kafka consumer service.
- The frontend, developed in Python (Streamlit), produces and sends messages to Kafka.
- A consumer, also implemented in Python, subscribes to the Kafka topic, processes the incoming messages, and writes the data to Amazon S3 / MinIO.
The goal is to provide a practical example of how Kafka can be used for message streaming between applications and cloud storage.
Run the following command in the project root:

```shell
docker-compose up -d
```

This will start the Kafka stack (broker, Kafka-UI, and MinIO).

Check Kafka-UI at: http://localhost:9007
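For orientation, a compose file for this stack could look roughly like the sketch below. This is a hypothetical example, not the project's actual `docker-compose.yml`: the images match the Docker Hub links referenced at the end of this README, but the service names, environment variables, and internal port mappings are assumptions.

```yaml
# Hypothetical sketch only -- service names, ports, and env vars are assumptions.
services:
  kafka:
    image: obsidiandynamics/kafka     # Kafka + ZooKeeper in one image
    ports:
      - "9092:9092"
  kafka-ui:
    image: provectuslabs/kafka-ui
    ports:
      - "9007:8080"                   # Kafka-UI exposed on localhost:9007
    environment:
      KAFKA_CLUSTERS_0_BOOTSTRAP_SERVERS: kafka:9092
  minio:
    image: minio/minio
    command: server /data --console-address ":9001"
    ports:
      - "9000:9000"                   # S3-compatible API on localhost:9000
```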
Prerequisites:

- Docker and Docker Compose installed
- Python 3.8+
Install Python dependencies:

```shell
pip install -r requirements.txt
```

The producer:

- Streamlit application to generate data
- Sends messages to Kafka
```shell
streamlit run app_producer.py
```

Link: http://localhost:8501
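The core of such a producer can be sketched as follows. This is a minimal illustration, not the project's `app_producer.py`: the topic name (`users`), broker address, and message shape are assumptions.

```python
# Hypothetical producer sketch -- topic, broker address, and message
# fields are assumptions, not taken from this project's code.
import json
from datetime import datetime, timezone


def build_message(name: str, email: str) -> dict:
    """Shape one event; fields here are illustrative only."""
    return {
        "name": name,
        "email": email,
        "created_at": datetime.now(timezone.utc).isoformat(),
    }


def serialize(message: dict) -> bytes:
    """kafka-python sends bytes, so JSON-encode the dict."""
    return json.dumps(message).encode("utf-8")


if __name__ == "__main__":
    from kafka import KafkaProducer  # pip install kafka-python

    producer = KafkaProducer(
        bootstrap_servers="localhost:9092",  # assumed broker address
        value_serializer=serialize,
    )
    producer.send("users", build_message("Alice", "alice@example.com"))
    producer.flush()  # block until the message is actually delivered
```

In the real app, Streamlit form inputs (or Faker-generated records) would feed `build_message` instead of hard-coded values.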
This script consumes messages from Kafka and stores the data in S3:

```shell
python consumer_s3.py
```
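A consumer of this kind can be sketched as below. Again this is an illustration, not the project's `consumer_s3.py`: the topic, bucket name, endpoint, and credentials are assumptions (the credentials only mirror the demo MinIO login for illustration).

```python
# Hypothetical consumer sketch -- topic, bucket, endpoint, and
# credentials are assumptions, not this project's actual code.
import json
from datetime import datetime, timezone


def object_key(topic: str, offset: int) -> str:
    """Build a deterministic S3 key per message, partitioned by date."""
    day = datetime.now(timezone.utc).strftime("%Y-%m-%d")
    return f"{topic}/{day}/{offset}.json"


if __name__ == "__main__":
    import boto3                      # pip install boto3
    from kafka import KafkaConsumer   # pip install kafka-python

    # MinIO speaks the S3 API, so boto3 only needs a custom endpoint.
    s3 = boto3.client(
        "s3",
        endpoint_url="http://localhost:9000",
        aws_access_key_id="chapolin",
        aws_secret_access_key="mudar@123",
    )
    consumer = KafkaConsumer(
        "users",                                  # assumed topic name
        bootstrap_servers="localhost:9092",
        value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    )
    for record in consumer:
        s3.put_object(
            Bucket="kafka-events",                # assumed bucket name
            Key=object_key(record.topic, record.offset),
            Body=json.dumps(record.value).encode("utf-8"),
        )
```

Keying objects by topic, date, and offset keeps writes idempotent: re-processing the same message overwrites the same object instead of duplicating it.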
MinIO console: http://localhost:9000
Login: chapolin
Password: mudar@123
References:

- https://kafka.apache.org/documentation
- https://hub.docker.com/r/obsidiandynamics/kafka
- https://hub.docker.com/r/provectuslabs/kafka-ui
- https://github.com/dpkp/kafka-python
- https://faker.readthedocs.io/en/master/
- https://whimsical.com/kafka-EbWjeGL3gDg9apxewMyGhB
- https://softwaremill.com/kafka-visualisation/
| Developer | Email | Portfolio |
|---|---|---|
| Wallace Camargo | wallacecpdg@gmail.com | Portfolio |