
This repository provides a hands-on quickstart guide to implementing Apache Kafka's publish-subscribe messaging pattern using Spring Boot. It demonstrates how to set up separate producer (publisher) and consumer applications, communicate through a shared data transfer object, and manage Kafka locally.
- Apache Kafka Fundamentals: Understand topics, producers, and consumers.
- Spring for Apache Kafka: Seamless integration of Kafka messaging into Spring Boot applications.
- Publish-Subscribe Pattern: Clear separation of concerns between message publishers and consumers.
- Asynchronous Communication: Real-time data processing without blocking.
- Loose Coupling: Independent services communicating via events.
- Shared DTO Module: Best practice for defining common data structures across services.
- Local Kafka Setup: Instructions for running Kafka using both Open Source Apache Kafka and Confluent Community Edition.
To get this project up and running locally, follow these steps.
Before you begin, ensure you have the following installed:
- Java Development Kit (JDK) 17 or higher
- Maven 3.6.0 or higher (or Gradle if you prefer, though this project uses Maven)
- Apache Kafka: You'll need a running Kafka instance. Instructions for local setup are provided below.
You have two main options for running Kafka locally; this project was set up following Option 1:
- Download Kafka: Visit the Apache Kafka downloads page and download the latest stable release.
- Extract the Archive: Unzip the downloaded `kafka_*.tgz` file to a directory of your choice (e.g., `C:\kafka`, `~/kafka`).
- Start Zookeeper: Kafka relies on Zookeeper. Open a new terminal in your Kafka installation directory and run:
```bash
# For Windows
.\bin\windows\zookeeper-server-start.bat .\config\zookeeper.properties

# For macOS/Linux
./bin/zookeeper-server-start.sh ./config/zookeeper.properties
```
- Start Kafka Server: Open another new terminal in your Kafka installation directory and run:
```bash
# For Windows
.\bin\windows\kafka-server-start.bat .\config\server.properties

# For macOS/Linux
./bin/kafka-server-start.sh ./config/server.properties
```
Alternatively (Option 2), Confluent Community Edition simplifies local Kafka setup significantly.
- Download Confluent CLI: Follow the instructions on the Confluent documentation to install the Confluent CLI.
- Start Confluent Platform (including Kafka and Zookeeper): Open a terminal and run:

```bash
confluent local services start
```

This command will start Zookeeper, Kafka, Schema Registry, and other components.
Once Kafka is running, you need to create the topic(s) that your publisher will write to and your consumer will read from. For this project, we'll use a topic named `orders-topic`.
```bash
# Using Apache Kafka CLI (from your Kafka installation directory)
# For Windows
.\bin\windows\kafka-topics.bat --create --topic orders-topic --bootstrap-server localhost:9092 --partitions 1 --replication-factor 1

# For macOS/Linux
./bin/kafka-topics.sh --create --topic orders-topic --bootstrap-server localhost:9092 --partitions 1 --replication-factor 1

# Using Confluent CLI
confluent kafka topic create orders-topic
```
You can verify the topic creation:
```bash
# Apache Kafka CLI
# For Windows
.\bin\windows\kafka-topics.bat --describe --topic orders-topic --bootstrap-server localhost:9092

# For macOS/Linux
./bin/kafka-topics.sh --describe --topic orders-topic --bootstrap-server localhost:9092

# Confluent CLI
confluent kafka topic describe orders-topic
```
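Alternatively, with Spring for Apache Kafka you can declare the topic as a bean and let Spring Boot's auto-configured `KafkaAdmin` create it on startup. This is a minimal sketch, not part of this repository's code; the `KafkaTopicConfig` class name is an assumption.

```java
import org.apache.kafka.clients.admin.NewTopic;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.config.TopicBuilder;

// Hypothetical config class: a NewTopic bean lets Spring Boot's
// auto-configured KafkaAdmin create the topic on startup if it is missing.
@Configuration
class KafkaTopicConfig {

    @Bean
    public NewTopic ordersTopic() {
        return TopicBuilder.name("orders-topic")
                .partitions(1)
                .replicas(1)
                .build();
    }
}
```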
Each component (`consumer`, `publisher`, `shared-dto`) is a separate Maven module.
- Clone the Repository:

```bash
git clone https://github.com/moshdev2213/springboot-kafka-pubsub-quickstart.git
cd springboot-kafka-pubsub-quickstart
```
- Build the Project: Build all modules to ensure dependencies are resolved and `shared-dto` is available for `publisher` and `consumer`.

```bash
mvn clean install
```
- Run the Consumer Application: Open a new terminal, navigate to the `consumer` directory, and run the application.

```bash
cd consumer
mvn spring-boot:run
```

The consumer will start listening for messages on the `orders-topic`.
- Run the Publisher Application: Open another new terminal, navigate to the `publisher` directory, and run the application.

```bash
cd publisher
mvn spring-boot:run
```

The publisher application will expose an endpoint to send messages. Check the `publisher` module's controller code for how to trigger message sending; a sketch of what such a controller typically looks like follows below.
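For orientation, here is a minimal sketch of a publisher endpoint in this style, assuming a `KafkaTemplate` configured with a JSON value serializer. The `/api/orders` path, the `OrderController` name, and the response wording are illustrative assumptions, not the repository's exact code; `OrderDTO` comes from the `shared-dto` module.

```java
import org.springframework.http.ResponseEntity;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

// Hypothetical controller: accepts an order over HTTP and publishes it to Kafka.
@RestController
@RequestMapping("/api/orders")
class OrderController {

    private final KafkaTemplate<String, OrderDTO> kafkaTemplate;

    OrderController(KafkaTemplate<String, OrderDTO> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    @PostMapping
    public ResponseEntity<String> publishOrder(@RequestBody OrderDTO order) {
        // Fire-and-forget send; KafkaTemplate returns a CompletableFuture
        // if you need to react to the broker's acknowledgement.
        kafkaTemplate.send("orders-topic", order);
        return ResponseEntity.ok("Order published to orders-topic");
    }
}
```

With the consumer running, POSTing a JSON order to an endpoint like this should show up in the consumer's log almost immediately.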

- The `publisher` application is responsible for creating and sending messages (events) to the Kafka `orders-topic`. These messages typically represent an "order" or a similar business event.
- The `consumer` application continuously listens to the `orders-topic`. Whenever a new message arrives, the consumer processes it (e.g., logs it, stores it in a database, or triggers other business logic).
- The `shared-dto` module defines the common data structure (e.g., `OrderDTO`) that both the publisher sends and the consumer expects, ensuring type safety and consistency across the message flow. A consumer-side sketch follows this list.
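To make the consumer side concrete, here is a minimal sketch of a Spring for Apache Kafka listener. The `order-consumers` group id, the `OrderConsumer` class name, and the record fields are assumptions; the real definitions live in the `consumer` and `shared-dto` modules.

```java
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Service;

// Stand-in for the DTO defined in the shared-dto module; fields are illustrative.
record OrderDTO(String orderId, String product, int quantity) {}

@Service
class OrderConsumer {

    // Spring invokes this method for every record arriving on the topic,
    // deserializing the JSON payload into an OrderDTO (assuming the consumer
    // is configured with Spring Kafka's JsonDeserializer).
    @KafkaListener(topics = "orders-topic", groupId = "order-consumers")
    public void onOrder(OrderDTO order) {
        // Process the event: log it, persist it, or trigger further logic.
        System.out.printf("Received order %s: %d x %s%n",
                order.orderId(), order.quantity(), order.product());
    }
}
```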
```
springboot-kafka-pubsub-quickstart/
├── consumer/      # Spring Boot application for consuming Kafka messages
│   ├── src/main/java/.../ConsumerApplication.java
│   └── ...
├── publisher/     # Spring Boot application for publishing Kafka messages
│   ├── src/main/java/.../PublisherApplication.java
│   └── ...
├── shared-dto/    # Common Data Transfer Objects (DTOs) shared by consumer and publisher
│   ├── src/main/java/.../OrderDTO.java
│   └── ...
└── README.md      # This file
```
- Java 17+
- Spring Boot 3.x
- Apache Kafka
- Maven
This quickstart is just the beginning! We have exciting plans to enhance this project:
- Dockerization: Containerize both the
publisher
andconsumer
Spring Boot applications using Docker, making them portable and easy to deploy. - Docker Compose Integration: Introduce a
docker-compose.yml
file to orchestrate the entire environment, including Kafka, Zookeeper, and our Spring Boot applications, for simplified local development and testing. - CI/CD Pipeline: Implement a Continuous Integration/Continuous Deployment (CI/CD) pipeline (e.g., using GitHub Actions, Jenkins, GitLab CI) to automate testing, building, and deployment processes.
- Cloud Deployment: Explore deploying the Dockerized applications to a cloud environment like an AWS EC2 instance or any other Virtual Machine (VM) running Docker, demonstrating a real-world deployment scenario.
Contributions are welcome! If you have suggestions for improvements, new features, or bug fixes, please feel free to:
- Fork the repository.
- Create a new branch (`git checkout -b feature/your-feature-name`).
- Make your changes.
- Commit your changes (`git commit -m 'feat: Add new feature'`).
- Push to the branch (`git push origin feature/your-feature-name`).
- Open a Pull Request.
- Apache Kafka Community
- Spring Framework Contributors