This repository offers a Spring Boot Kafka Pub/Sub quickstart 🚀, demonstrating asynchronous, loosely coupled communication ✨. It provides separate publisher and consumer services with shared DTOs, plus local Kafka setup instructions 💻. Future plans include Dockerization 🐳, Docker Compose, CI/CD integration 🚀, and cloud deployment to EC2/VMs ☁️.

Spring Boot Kafka Quickstart


🌟 Features & Concepts Demonstrated

This repository provides a hands-on quickstart guide to implementing Apache Kafka's publish-subscribe messaging pattern using Spring Boot. It demonstrates how to set up separate producer (publisher) and consumer applications, communicate through a shared data transfer object, and manage Kafka locally.

  • Apache Kafka Fundamentals: Understand topics, producers, and consumers.
  • Spring for Apache Kafka: Seamless integration of Kafka messaging into Spring Boot applications.
  • Publish-Subscribe Pattern: Clear separation of concerns between message publishers and consumers.
  • Asynchronous Communication: Real-time data processing without blocking.
  • Loose Coupling: Independent services communicating via events.
  • Shared DTO Module: Best practice for defining common data structures across services.
  • Local Kafka Setup: Instructions for running Kafka using both Open Source Apache Kafka and Confluent Community Edition.

🚀 Getting Started

To get this project up and running locally, follow these steps.

Prerequisites

Before you begin, ensure you have the following installed:

  • Java Development Kit (JDK) 17 or higher
  • Maven 3.6.0 or higher (or Gradle if you prefer, though this project uses Maven)
  • Apache Kafka: You'll need a running Kafka instance. Instructions for local setup are provided below.

βš™οΈ Setting Up Kafka Locally

You have two main options for running Kafka locally; this project was set up using Option 1:

Option 1: Open Source Apache Kafka

  1. Download Kafka: Visit the Apache Kafka downloads page and download the latest stable release.
  2. Extract the Archive: Unzip the downloaded kafka_*.tgz file to a directory of your choice (e.g., C:\kafka, ~/kafka).
  3. Start ZooKeeper: Kafka relies on ZooKeeper for cluster coordination (newer Kafka releases can instead run in KRaft mode without it). Open a new terminal in your Kafka installation directory and run:
    # For Windows
    .\bin\windows\zookeeper-server-start.bat .\config\zookeeper.properties
    
    # For macOS/Linux
    ./bin/zookeeper-server-start.sh ./config/zookeeper.properties
  4. Start Kafka Server: Open another new terminal in your Kafka installation directory and run:
    # For Windows
    .\bin\windows\kafka-server-start.bat .\config\server.properties
    
    # For macOS/Linux
    ./bin/kafka-server-start.sh ./config/server.properties

Option 2: Confluent Community Edition (Recommended for ease of use)

Confluent Community Edition simplifies local Kafka setup significantly.

  1. Download Confluent CLI: Follow the instructions on the Confluent documentation to install the Confluent CLI.
  2. Start Confluent Platform (including Kafka and Zookeeper): Open a terminal and run:
    confluent local services start
    This command will start Zookeeper, Kafka, Schema Registry, and other components.

πŸ“ Creating Kafka Topic(s)

Once Kafka is running, you need to create the topic(s) that your publisher will write to and your consumer will read from.

For this project, let's assume we'll use a topic named orders-topic.

# Using Apache Kafka CLI (from your Kafka installation directory)
# For Windows
.\bin\windows\kafka-topics.bat --create --topic orders-topic --bootstrap-server localhost:9092 --partitions 1 --replication-factor 1

# For macOS/Linux
./bin/kafka-topics.sh --create --topic orders-topic --bootstrap-server localhost:9092 --partitions 1 --replication-factor 1

# Using Confluent CLI
confluent kafka topic create orders-topic

You can verify the topic creation:

# Apache Kafka CLI
# For Windows
.\bin\windows\kafka-topics.bat --describe --topic orders-topic --bootstrap-server localhost:9092

# For macOS/Linux
./bin/kafka-topics.sh --describe --topic orders-topic --bootstrap-server localhost:9092

# Confluent CLI
confluent kafka topic describe orders-topic
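As an alternative to the CLI, Spring Kafka can create the topic at application startup by declaring a `NewTopic` bean. This is a sketch, not code from this repository; the class name and placement are illustrative:

```java
// Sketch: Spring's auto-configured KafkaAdmin creates "orders-topic" at
// startup if it does not already exist. (Illustrative: add a class like
// this to the publisher module's configuration.)
import org.apache.kafka.clients.admin.NewTopic;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.config.TopicBuilder;

@Configuration
public class KafkaTopicConfig {

    @Bean
    public NewTopic ordersTopic() {
        // Same settings as the CLI command above: 1 partition, 1 replica.
        return TopicBuilder.name("orders-topic")
                .partitions(1)
                .replicas(1)
                .build();
    }
}
```

This keeps topic settings in version control, which is handy once the project moves to Docker Compose or CI/CD.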

πŸƒ Running the Spring Boot Applications

Each component (consumer, publisher, shared-dto) is a separate Maven module.

  1. Clone the Repository:

    git clone https://github.com/moshdev2213/springboot-kafka-pubsub-quickstart.git
    cd springboot-kafka-pubsub-quickstart
  2. Build the Project: Build all modules to ensure dependencies are resolved and shared-dto is available for publisher and consumer.

    mvn clean install
  3. Run the Consumer Application: Open a new terminal, navigate to the consumer directory, and run the application.

    cd consumer
    mvn spring-boot:run

    The consumer will start listening for messages on the orders-topic.

  4. Run the Publisher Application: Open another new terminal, navigate to the publisher directory, and run the application.

    cd publisher
    mvn spring-boot:run

    The publisher application will expose an endpoint to send messages. Check the publisher module's Controller code for how to trigger message sending.
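A minimal publisher endpoint looks roughly like the sketch below. The endpoint path, class name, and payload are illustrative assumptions (check the actual Controller in the publisher module); `OrderDTO` comes from the shared-dto module, and JSON (de)serializers are assumed to be configured in the application properties:

```java
// Sketch of a publisher endpoint; names and paths are illustrative.
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class OrderController {

    // OrderDTO is the shared type from the shared-dto module.
    private final KafkaTemplate<String, OrderDTO> kafkaTemplate;

    public OrderController(KafkaTemplate<String, OrderDTO> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    // POST /orders publishes the request body to orders-topic.
    // send() is asynchronous, so the HTTP response does not wait on Kafka.
    @PostMapping("/orders")
    public String publish(@RequestBody OrderDTO order) {
        kafkaTemplate.send("orders-topic", order);
        return "order published";
    }
}
```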


💡 How It Works


  • The publisher application is responsible for creating and sending messages (events) to the Kafka orders-topic. These messages typically represent an "order" or a similar business event.
  • The consumer application continuously listens to the orders-topic. Whenever a new message arrives, the consumer processes it (e.g., logs it, stores it in a database, or triggers other business logic).
  • The shared-dto module defines the common data structure (e.g., OrderDTO) that both the publisher sends and the consumer expects, ensuring type safety and consistency across the message flow.
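On the consuming side, Spring Kafka reduces this to a single annotated method. The sketch below is illustrative (class, method, and group id are assumptions, not the repository's actual names); spring-kafka invokes the method for every record that arrives on the topic:

```java
// Sketch of the consumer; names and the consumer group id are illustrative.
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Service;

@Service
public class OrderListener {

    // Called once per record on orders-topic; consumers in the same
    // groupId share the topic's partitions between them.
    @KafkaListener(topics = "orders-topic", groupId = "order-consumers")
    public void onOrder(OrderDTO order) {
        // Processing is application-specific: log, persist, or trigger
        // further business logic.
        System.out.println("Received order: " + order);
    }
}
```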

📂 Project Structure

springboot-kafka-pubsub-quickstart/
├── consumer/                 # Spring Boot application for consuming Kafka messages
│   ├── src/main/java/.../ConsumerApplication.java
│   └── ...
├── publisher/                # Spring Boot application for publishing Kafka messages
│   ├── src/main/java/.../PublisherApplication.java
│   └── ...
├── shared-dto/               # Common Data Transfer Objects (DTOs) shared by consumer and publisher
│   ├── src/main/java/.../OrderDTO.java
│   └── ...
└── README.md                 # This file
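With Java 17, the shared DTO can be as small as a record. The fields below are a hypothetical shape for illustration; the real definition lives in shared-dto's OrderDTO.java:

```java
// Hypothetical shape of the shared DTO (actual fields are defined in the
// shared-dto module). A record provides immutable accessors, equals/hashCode,
// and toString for free, keeping publisher and consumer payloads consistent.
record OrderDTO(String orderId, String product, int quantity, double price) {}
```

Because both modules depend on shared-dto, a field change is a single edit that the compiler propagates to publisher and consumer alike.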

πŸ› οΈ Technologies Used

  • Java 17+
  • Spring Boot 3.x
  • Apache Kafka
  • Maven

🚀 What's Next? (Future Plans)

This quickstart is just the beginning! We have exciting plans to enhance this project:

  • Dockerization: Containerize both the publisher and consumer Spring Boot applications using Docker, making them portable and easy to deploy.
  • Docker Compose Integration: Introduce a docker-compose.yml file to orchestrate the entire environment, including Kafka, Zookeeper, and our Spring Boot applications, for simplified local development and testing.
  • CI/CD Pipeline: Implement a Continuous Integration/Continuous Deployment (CI/CD) pipeline (e.g., using GitHub Actions, Jenkins, GitLab CI) to automate testing, building, and deployment processes.
  • Cloud Deployment: Explore deploying the Dockerized applications to a cloud environment like an AWS EC2 instance or any other Virtual Machine (VM) running Docker, demonstrating a real-world deployment scenario.

🤝 Contributing

Contributions are welcome! If you have suggestions for improvements, new features, or bug fixes, please feel free to:

  1. Fork the repository.
  2. Create a new branch (git checkout -b feature/your-feature-name).
  3. Make your changes.
  4. Commit your changes (git commit -m 'feat: Add new feature').
  5. Push to the branch (git push origin feature/your-feature-name).
  6. Open a Pull Request.

πŸ™ Acknowledgements

  • Apache Kafka Community
  • Spring Framework Contributors
