Set up a Kafka broker on Docker
As you might know, Apache Kafka is a distributed event store and stream-processing platform. You can use it as a message broker based on the publish-subscribe principle: any number of systems or real-time applications can publish data to it and subscribe to it. Other popular use cases include using it as durable storage, where it can act as a 'source of truth', and as an event-streaming platform where you can transform data as it arrives.
Setting up a Kafka broker locally
To set up Kafka locally, follow the steps described in the Confluent quickstart:
https://developer.confluent.io/quickstart/kafka-local
The process is very straightforward.
Setting up a Kafka broker on Docker
These days, instead of manually downloading a bunch of compressed files, extracting their contents, and then installing everything locally, developers prefer to dockerize their services, which is fast, reliable, and easy to manage.
Docker Compose is popular for this because it lets you manage a multi-container service very easily, often with a single command. To use Docker Compose, you define a YAML file that contains the configuration for each container you would like to run.
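As a rough sketch, a minimal docker-compose.yml for a single-broker Kafka setup might look like the following. It uses the Confluent community images; the image versions and port mappings here are assumptions, so check the image documentation for the full set of supported environment variables:

```yaml
version: "3.8"
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:7.4.0   # version tag is an assumption; pick one that suits you
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
      ZOOKEEPER_TICK_TIME: 2000

  kafka:
    image: confluentinc/cp-kafka:7.4.0       # version tag is an assumption
    depends_on:
      - zookeeper
    ports:
      - "9092:9092"                          # expose the broker to the host
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      # Clients on the host connect via localhost:9092
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092
      # A single broker cannot satisfy the default replication factor of 3
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
```

With this file in place, `docker compose up -d` starts both containers in the background, and `docker compose down` tears them down again.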