Preparation for Kafka Streams API hands-on practice lab
To avoid wasting precious time during the training on downloading hundreds of megabytes from the Internet and on solving technical problems, please read carefully and fully follow these instructions before the training starts.
Your machine should have:
- Webcam and microphone (participation in the training takes place with the webcam turned on and will require periodic screen sharing!)
- Docker,
- Java 8/11,
- Maven,
- IntelliJ IDEA (with Lombok plugin).
You may use a more recent version of Java or another IDE that you are more familiar with, but with no guarantee that I will be able to help if problems arise.
- Clone the project https://github.com/inponomarev/kstreams-class to your local drive.
- Run
docker-compose up
at the root of the project. This command will download the necessary Docker images and start a mini Kafka cluster on your machine.
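As a sketch of how to confirm the stack came up (the -d flag runs it in the background, unlike the foreground command above):

```shell
# Start the mini cluster detached, then list the running containers;
# the broker and the kafkacat container should appear in the output
docker-compose up -d
docker ps --format '{{.Names}}'
```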
- Check the health of Kafka and kcat. To do this, log into the Docker container
docker exec -it kafkacat /bin/bash
and run the command
kcat -L -b broker:29092
If you see text like "Metadata for all topics... broker1 at localhost:9092", then Kafka and kcat are up and running on your machine!
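Beyond listing metadata, you can also do a quick produce/consume round trip from inside the same container. The topic name smoke-test is purely illustrative, and this sketch assumes the lab cluster allows automatic topic creation:

```shell
# Produce a single test message...
echo "hello, kafka" | kcat -P -b broker:29092 -t smoke-test
# ...then read the topic from the beginning and exit at end of data (-e)
kcat -C -b broker:29092 -t smoke-test -o beginning -e
```

If the second command prints the test message back, the broker is accepting and serving data, not just reporting metadata.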
- Run
mvn clean install
at the root of the project. This command will download and cache the required libraries. The build should succeed!
- Open the project in IntelliJ IDEA. If necessary, install the Lombok plugin (Shift-Shift, Plugins, then find and install the Lombok plugin). Once import and indexing are complete, there should be no "red underlined" code.
- Make sure you can run the Producer module (as in the screenshot) and that it starts producing data into your locally deployed Kafka (the log output should look like the screenshot below):
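As an extra check, you can watch the Producer's data arrive with kcat from inside the kafkacat container. The topic name orders-topic below is purely illustrative; substitute whichever topic the Producer's log says it writes to:

```shell
# Dump everything produced so far on the Producer's topic and exit (-e);
# orders-topic is a placeholder for the actual topic name from the log
kcat -C -b broker:29092 -t orders-topic -o beginning -e
```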
Congratulations, you are now fully prepared for the training!
