This repository is an example project demonstrating a Kafka implementation in Node.js.

This project depends on the following npm packages:

- `kafkajs`: for connecting to and using Kafka
- `avsc`: for fast and compact JSON data serialization
- `@faker-js/faker`: for generating random data
Additionally, this example uses Docker to run Kafka locally. You can do the same by installing Docker, switching into the code directory, and running:

docker-compose up -d

If this setup is confusing, you can instead use a Kafka-as-a-Service provider such as Confluent Cloud.
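The repository ships its own `docker-compose.yml` (described further below). As a rough, hypothetical sketch of what a single-broker Compose file along these lines can look like (the image names, ports, and environment variables here are assumptions, not the repo's actual configuration):

```yaml
# Hypothetical single-broker setup; the repository's own
# docker-compose.yml may differ.
version: "3"
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:latest
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
  kafka:
    image: confluentinc/cp-kafka:latest
    depends_on:
      - zookeeper
    ports:
      - "9092:9092"
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
```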
- Please star and fork the repo
- Then clone the repo in the local environment using the following command in the Terminal:
git clone https://github.com/prashant1k99/kafka-example.git

- Install the dependencies with the following command:

npm install

- Create Kafka topics by running:
npm run start:admin
- Start the Kafka topic consumer by running:
npm run start:consumer
- Trigger the Kafka producer by running:
npm run start:producer
- `util.js`: This file contains the function to generate random user information.
- `kafka.js`: This file contains the code for setting up the connection with the Kafka service.
- `docker-compose.yml`: This is the yml configuration file to start Kafka using Docker.
- `admin.js`: This file contains the logic for creating Kafka topics.
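A hedged sketch of what topic creation with the kafkajs admin client can look like; the topic name, client id, and broker address below are assumptions, not necessarily what `admin.js` uses:

```javascript
// Hypothetical sketch of topic creation with kafkajs;
// "users" and "localhost:9092" are assumed values.
const { Kafka } = require("kafkajs");

const kafka = new Kafka({
  clientId: "admin-script",
  brokers: ["localhost:9092"],
});

async function createTopics() {
  const admin = kafka.admin();
  await admin.connect();
  // Create the topic if it does not already exist
  await admin.createTopics({
    topics: [{ topic: "users", numPartitions: 2 }],
  });
  await admin.disconnect();
}

createTopics().catch(console.error);
```

Running this requires the Kafka broker from the Docker setup above to be up and reachable.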
- schema
  - `user.schema.js`: This contains the schema for the User type that is passed as data. This code is responsible for encoding and decoding `UserData` when sending and receiving data via Kafka, because Kafka by default only supports Buffer and string data.
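Since Kafka message values must be Buffers or strings, objects have to be serialized before sending and deserialized after receiving. The project uses `avsc` for compact Avro encoding; as a stdlib-only illustration of the same round-trip idea (the `encodeUser`/`decodeUser` helpers here are hypothetical, not the repo's code):

```javascript
// Illustrative JSON round trip; the real schema code uses avsc's
// Avro encoding instead of plain JSON for a more compact payload.
function encodeUser(user) {
  // Serialize the object into a Buffer Kafka can carry
  return Buffer.from(JSON.stringify(user));
}

function decodeUser(buffer) {
  // Restore the original object on the consumer side
  return JSON.parse(buffer.toString("utf8"));
}

const user = { id: 1, name: "Ada" };
const encoded = encodeUser(user); // Buffer, ready for the producer
const decoded = decodeUser(encoded);
```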
- producer
  - `index.js`: This file contains the code for producing data to Kafka. After connecting to Kafka, it uses `producer.send()` to send data.
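A minimal, hedged sketch of sending a message with a kafkajs producer; the topic name, client id, and broker address are assumptions, and the real `index.js` encodes the payload with the avsc schema first:

```javascript
// Hypothetical kafkajs producer sketch; "users" and
// "localhost:9092" are assumed values.
const { Kafka } = require("kafkajs");

const kafka = new Kafka({
  clientId: "example-producer",
  brokers: ["localhost:9092"],
});
const producer = kafka.producer();

async function produce(encodedUser) {
  await producer.connect();
  // Each message value must be a Buffer or string
  await producer.send({
    topic: "users",
    messages: [{ value: encodedUser }],
  });
  await producer.disconnect();
}
```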
- consumer
  - `index.js`: This file contains the code for subscribing to the topic using `consumer.subscribe()`; once subscribed, the consumer receives every event the producer publishes to that topic.
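A hedged sketch of the subscribe-and-run flow with a kafkajs consumer; the group id, topic name, and broker address are assumptions rather than the repo's actual values:

```javascript
// Hypothetical kafkajs consumer sketch; "example-group", "users",
// and "localhost:9092" are assumed values.
const { Kafka } = require("kafkajs");

const kafka = new Kafka({
  clientId: "example-consumer",
  brokers: ["localhost:9092"],
});
const consumer = kafka.consumer({ groupId: "example-group" });

async function consume() {
  await consumer.connect();
  await consumer.subscribe({ topics: ["users"], fromBeginning: true });
  await consumer.run({
    // eachMessage is invoked for every event published to the topic
    eachMessage: async ({ topic, partition, message }) => {
      console.log(`${topic}[${partition}]`, message.value);
    },
  });
}

consume().catch(console.error);
```

In the project itself, the message value would be decoded with the avsc schema before use, since it arrives as a Buffer.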
For more details, please read the official Kafka documentation.
