This repository has been archived by the owner on Feb 4, 2022. It is now read-only.

A POC of a Kafka pipeline with Avro4s and schema registry

dixahq/kafka-pipeline-with-avro4s-POC


Proof of Concept: A Kafka Pipeline with Avro4s and Confluent Schema Registry

This POC contains a Kafka consumer and a Kafka producer. Messages are automatically serialized and deserialized with the Avro4s library, and the producer and consumer use a Confluent Schema Registry to validate the message schemas.

Prerequisites

  • Apache Kafka (bundles Zookeeper)
  • Confluent Platform (for the Schema Registry)
  • sbt (to build the project and generate the Avro schema)

Deployment

  • Start the Zookeeper server. Run from the Kafka root folder:
bin/zookeeper-server-start.sh config/zookeeper.properties
  • Start the Kafka server. Run from the Kafka root folder:
bin/kafka-server-start.sh config/server.properties
  • Start the Schema Registry. Run from the Confluent Platform root:
bin/schema-registry-start etc/schema-registry/schema-registry.properties
  • Create a new topic "my_topic". Run from the Kafka root:
bin/kafka-topics.sh --create \
    --zookeeper <hostname>:<port> \
    --topic my_topic \
    --partitions <number-of-partitions> \
    --replication-factor <number-of-replicating-servers>
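For a local single-broker setup, the placeholders above can be filled in as follows (host, port, and sizing values are illustrative; they assume Zookeeper is running on its default port):

```shell
# Illustrative values for a local POC: Zookeeper on its default port 2181,
# a single partition, and replication factor 1 (only one broker is running).
bin/kafka-topics.sh --create \
    --zookeeper localhost:2181 \
    --topic my_topic \
    --partitions 1 \
    --replication-factor 1
```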

Verifying

You can use the Kafka console tools to verify that the consumer and the producer work as expected:

bin/kafka-console-producer.sh --broker-list localhost:9092 --topic my_topic
bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic my_topic --from-beginning

Generating Avro Files with Avro4s

Run from the producer root:

sbt generateAvro

Deploying the Schema to the Schema Registry

After the Avro file is generated, you need to register it with the Confluent Schema Registry through its REST API.
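As a sketch, assuming the registry is running locally on its default port 8081 and the default subject naming ("<topic>-value"), a schema can be registered with a POST to the subjects endpoint. The trivial string schema below is a placeholder; in practice you would embed the contents of the generated .avsc file, escaped as a JSON string:

```shell
# Register a schema under the subject for my_topic's values.
# Registry URL, subject name, and the schema itself are illustrative.
# The REST API expects the Avro schema embedded as an escaped JSON string.
curl -X POST http://localhost:8081/subjects/my_topic-value/versions \
  -H "Content-Type: application/vnd.schemaregistry.v1+json" \
  --data '{"schema": "{\"type\": \"string\"}"}'
```

On success the registry responds with the assigned schema id, e.g. `{"id":1}`.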
