Breaking the Scale Barrier: 1 Million Messages with NodeJS and Kafka

Published: 1/15/2025
Categories: node, microservices, pubsub, kafka
Author: fahim_hasnainfahad_7e50d

Apache Kafka is a distributed event streaming platform that can serve as a Pub-Sub broker in NodeJS microservices. Recently, I worked on a project where a publisher needed to send messages to 1 million users and report the results back. A message queue handles this nicely: push 1 million messages onto a queue and they get delivered to consumers one by one. A pub-sub, on the other hand, broadcasts each message to every consumer instance subscribed to a given topic.

The Magic of Message Queues vs. Pub-Sub in Kafka

When faced with such a massive scale of messaging, two approaches often come to mind:

  1. Message Queue: Send 1 million messages to a queue, where each message is processed one by one by consumers.

  2. Pub-Sub: Broadcast a single message to all instances of consumers subscribed to a specific topic.

For our use case, Kafka’s flexibility in handling both patterns made it the ideal choice. Let’s dive into how to implement a Producer-Consumer microservice system with Kafka.
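In Kafka, both patterns fall out of consumer groups: consumers that share a groupId split a topic's partitions between them (queue-like load balancing), while consumers in different groups each receive every message (pub-sub). Here is a minimal sketch of that idea with kafkajs; the group names, labels, and the startConsumer helper are illustrative, and the env vars match the ones used later in this post:

const { Kafka } = require('kafkajs');

const kafka = new Kafka({
  clientId: 'pattern-demo',
  brokers: [process.env.KAFKA_BROKER],
});

// Helper: start one consumer in the given group and log what it receives.
const startConsumer = async (groupId, label) => {
  const consumer = kafka.consumer({ groupId });
  await consumer.connect();
  await consumer.subscribe({ topic: process.env.TOPIC_NAME, fromBeginning: true });
  await consumer.run({
    eachMessage: async ({ partition, message }) => {
      console.log(`[${label}] partition ${partition}: ${message.value}`);
    },
  });
};

(async () => {
  // Queue-like: two consumers share the 'workers' group, so the topic's
  // partitions are split between them and each message is handled once.
  await startConsumer('workers', 'worker-1');
  await startConsumer('workers', 'worker-2');

  // Pub-sub: a consumer in its own group independently receives every message.
  await startConsumer('audit', 'auditor');
})();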

Check out the demo code: https://github.com/fahadfahim13/producer-consumer-kafka.git

Architecture

NodeJS Microservice with Kafka

Producer Microservice

Let's create a Producer microservice and a Consumer microservice.
The code in producer/index.js looks like this:

const { Kafka } = require('kafkajs');

// Broker address comes from the KAFKA_BROKER env var set in docker-compose
const kafka = new Kafka({
  clientId: 'my-producer',
  brokers: [process.env.KAFKA_BROKER],
});

const producer = kafka.producer();

(async () => {
  await producer.connect();
  // Publish 1 million messages to the configured topic, one send per message
  for (let i = 0; i < 1000000; i++) {
    await producer.send({
      topic: process.env.TOPIC_NAME,
      messages: [{ value: `Message ${i}` }],
    });
    console.log(`Sent message ${i}`);
  }
  await producer.disconnect();
})();

Here, the producer publishes 1 million messages to a Kafka topic, one send at a time.
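Awaiting producer.send once per message works, but at 1 million messages the per-request round trips add up. One possible optimization, sketched below and not part of the demo repo, is to group messages into batches; the batch size of 1,000 is an arbitrary assumption to tune against your message size and broker limits:

const { Kafka } = require('kafkajs');

const kafka = new Kafka({
  clientId: 'my-producer',
  brokers: [process.env.KAFKA_BROKER],
});

const producer = kafka.producer();
const TOTAL_MESSAGES = 1000000;
const BATCH_SIZE = 1000; // assumption: tune for your workload

(async () => {
  await producer.connect();
  for (let start = 0; start < TOTAL_MESSAGES; start += BATCH_SIZE) {
    // Build one batch of messages and send them in a single request.
    const messages = [];
    for (let i = start; i < Math.min(start + BATCH_SIZE, TOTAL_MESSAGES); i++) {
      messages.push({ value: `Message ${i}` });
    }
    await producer.send({ topic: process.env.TOPIC_NAME, messages });
    console.log(`Sent messages ${start}..${start + messages.length - 1}`);
  }
  await producer.disconnect();
})();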

Consumer Microservice

The code in the consumer microservice's index.js looks like this:

const { Kafka } = require('kafkajs');

const kafka = new Kafka({
  clientId: 'my-consumer',
  brokers: [process.env.KAFKA_BROKER],
});

// Consumers sharing this groupId split the topic's partitions between them
const consumer = kafka.consumer({ groupId: 'test-group' });

(async () => {
  await consumer.connect();
  await consumer.subscribe({ topic: process.env.TOPIC_NAME, fromBeginning: true });

  // Process each message as it arrives
  await consumer.run({
    eachMessage: async ({ topic, partition, message }) => {
      console.log(`Received message: ${message.value} in topic: ${topic} at partition: ${partition}`);
    },
  });
})();

The consumers subscribe to the Kafka topic, receive messages asynchronously, and process them as they arrive.
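If per-message handling becomes the bottleneck on the consumer side, kafkajs also exposes a batch-oriented handler. The sketch below is one way to process messages per batch while resolving offsets and keeping the group alive with heartbeats; it is an illustration under the same env vars, not part of the demo repo:

const { Kafka } = require('kafkajs');

const kafka = new Kafka({
  clientId: 'my-batch-consumer',
  brokers: [process.env.KAFKA_BROKER],
});

const consumer = kafka.consumer({ groupId: 'test-group' });

(async () => {
  await consumer.connect();
  await consumer.subscribe({ topic: process.env.TOPIC_NAME, fromBeginning: true });

  await consumer.run({
    eachBatch: async ({ batch, resolveOffset, heartbeat }) => {
      for (const message of batch.messages) {
        // Process the message (here: just log it), then mark its offset as handled.
        console.log(`partition ${batch.partition}: ${message.value.toString()}`);
        resolveOffset(message.offset);
      }
      // Tell the broker this consumer is still alive between batches.
      await heartbeat();
    },
  });
})();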

Docker-compose File

The docker-compose.yml file below runs Zookeeper, Kafka, and the producer and consumer services in separate containers.

version: '3.8'
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:latest
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
      ZOOKEEPER_TICK_TIME: 2000
    ports:
      - "22181:2181"
    networks:
      - kafka-network

  kafka:
    image: confluentinc/cp-kafka:latest
    depends_on:
      - zookeeper
    ports:
      - "29092:29092"
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:9092,PLAINTEXT_HOST://localhost:29092
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
      KAFKA_INTER_BROKER_LISTENER_NAME: PLAINTEXT
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
    networks:
      - kafka-network
    volumes:
      # persist broker data between restarts
      - kafka_data:/var/lib/kafka/data

  producer:
    build: 
      context: ./producer
      dockerfile: Dockerfile
    container_name: kafka-producer
    depends_on:
      - kafka
    networks:
      - kafka-network
    volumes:
      - ./producer:/app
    environment:
      KAFKA_BOOTSTRAP_SERVERS: kafka:9092
      KAFKA_BROKER: kafka:9092
      TOPIC_NAME: my-topic

  consumer:
    build: 
      context: ./consumer
      dockerfile: Dockerfile
    container_name: kafka-consumer
    depends_on:
      - kafka
    networks:
      - kafka-network
    volumes:
      - ./consumer:/app
    environment:
      KAFKA_BOOTSTRAP_SERVERS: kafka:9092
      KAFKA_BROKER: kafka:9092
      TOPIC_NAME: my-topic

networks:
  kafka-network:
    driver: bridge


volumes:
  kafka_data:
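With the services defined, the whole stack can be started and observed from the project root. The commands below are a usage sketch; the service names match the compose file above:

# Build images and start zookeeper, kafka, producer and consumer in the background
docker-compose up --build -d

# Follow the consumer logs to watch messages arrive
docker-compose logs -f consumer

# Tear everything down (add -v to also remove the kafka_data volume)
docker-compose down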

Optimizing Kafka Performance with Partitions

To handle high throughput, it’s essential to increase the number of partitions in our Kafka topic. More partitions allow parallel processing, enabling producers and consumers to scale horizontally. For example:

kafka-topics --create \
  --bootstrap-server localhost:29092 \
  --replication-factor 1 \
  --partitions 10 \
  --topic my-topic

Each partition acts as a separate queue, distributing the load among consumers.
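Since messages with the same key always land on the same partition, keying each message by user ID is one way to spread the 1 million sends across all 10 partitions while preserving per-user ordering. A minimal sketch, assuming the same producer setup as above; the userId and payload are illustrative:

const { Kafka } = require('kafkajs');

const kafka = new Kafka({
  clientId: 'my-producer',
  brokers: [process.env.KAFKA_BROKER],
});

const producer = kafka.producer();

// The same key always hashes to the same partition, so ordering is kept
// per user while the overall load spreads across partitions.
const sendToUser = (userId, payload) =>
  producer.send({
    topic: process.env.TOPIC_NAME,
    messages: [{ key: String(userId), value: JSON.stringify(payload) }],
  });

(async () => {
  await producer.connect();
  await sendToUser(42, { text: 'hello' }); // illustrative user ID and payload
  await producer.disconnect();
})();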

Check out the demo code: https://github.com/fahadfahim13/producer-consumer-kafka.git
