
[Short Blog] Apache Kafka consumer with WSO2 Micro Integrator

This document contains the steps I followed to set up a WSO2 Micro Integrator (MI) inbound endpoint that consumes Kafka Avro messages.

Setting up the Kafka server and UI

First, download kafka_2.11-2.2.1, which provides both ZooKeeper and the Kafka server: https://archive.apache.org/dist/kafka/2.2.1/kafka_2.11-2.2.1.tgz

Start ZooKeeper from the kafka_2.11-2.2.1 distribution. Go to the Kafka home folder and execute:
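Assuming the default zookeeper.properties that ships with the distribution:

    bin/zookeeper-server-start.sh config/zookeeper.properties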

In the same way, start the Kafka server with:
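Again, assuming the default server.properties from the same distribution:

    bin/kafka-server-start.sh config/server.properties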

Download confluent-5.2.1 for the Schema Registry: https://packages.confluent.io/archive/5.2/confluent-community-5.2.1-2.11.tar.gz

Start the Schema Registry with the following command from the Confluent home folder:
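Assuming the default schema-registry.properties bundled with the Confluent community package:

    bin/schema-registry-start etc/schema-registry/schema-registry.properties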

As the UI for Kafka, I used https://github.com/provectus/kafka-ui. Optionally, you can use it to confirm that messages are being published correctly. You can use the prebuilt jar file and start the UI with the following command:
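A typical invocation, assuming the kafka-ui-0.3.3.jar and the application-local.yml described below sit in the current folder:

    java -Dspring.config.additional-location=application-local.yml -jar kafka-ui-0.3.3.jar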

You can access the UI at http://localhost:8080/. There, you can check topics, messages, and consumers, and modify or create them as needed.

Download the kafka-ui jar from https://github.com/provectus/kafka-ui/releases/download/0.3.3/kafka-ui-0.3.3.jar

Use the following application-local.yml file:
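A minimal sketch of that file, assuming Kafka, ZooKeeper, and the Schema Registry are all running locally on their default ports (the cluster name local is arbitrary):

    kafka:
      clusters:
        - name: local
          bootstrapServers: localhost:9092
          zookeeper: localhost:2181
          schemaRegistry: http://localhost:8081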

Building the MI Kafka inbound endpoint

Copy the following jar files from <Kafka Home>/libs to the <MI Home>/lib folder:

  • kafka_2.11-2.2.1.jar
  • kafka-clients-2.2.1.jar
  • metrics-core-2.2.0.jar
  • scala-library-2.11.12.jar
  • zkclient-0.11.jar
  • zookeeper-3.4.13.jar

Copy the following jars from the Maven repository to the <MI Home>/lib folder:

  • jackson-core-asl-1.9.13.jar
  • jackson-mapper-asl-1.9.13.jar
  • common-config-5.4.0.jar
  • common-utils-5.4.0.jar
  • kafka-avro-serializer-5.3.0.jar
  • kafka-schema-registry-client-5.3.0.jar
  • avro-1.8.1.jar

Download the Kafka inbound endpoint from the WSO2 connector store at https://store.wso2.com/store/assets/esbconnector/details/b15e9612-5144-4c97-a3f0-179ea583be88 (download the inbound endpoint, not the connector) and copy it into the <MI Home>/lib folder.

Copy the following inbound endpoint configuration into the <MI Home>/repository/deployment/server/synapse-configs/default/inbound-endpoints/ folder:
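A minimal sketch of such an inbound endpoint, assuming the topic test-topic, the consumer group mi-test-group, the Schema Registry at http://localhost:8081, and a sequence named kafka_process_seq that simply logs the incoming message. The class and parameter names follow the WSO2 Kafka inbound endpoint documentation, so double-check them against the version you downloaded:

    <inboundEndpoint xmlns="http://ws.apache.org/ns/synapse"
                     name="KafkaAvroInboundEP"
                     class="org.wso2.carbon.inbound.kafka.KafkaMessageConsumer"
                     sequence="kafka_process_seq"
                     onError="fault"
                     suspend="false">
      <parameters>
        <parameter name="inbound.behavior">polling</parameter>
        <parameter name="sequential">true</parameter>
        <parameter name="coordination">true</parameter>
        <parameter name="interval">10</parameter>
        <parameter name="bootstrap.servers">localhost:9092</parameter>
        <parameter name="topic.name">test-topic</parameter>
        <parameter name="group.id">mi-test-group</parameter>
        <parameter name="poll.timeout">100</parameter>
        <parameter name="contentType">text/plain</parameter>
        <parameter name="key.deserializer">org.apache.kafka.common.serialization.StringDeserializer</parameter>
        <parameter name="value.deserializer">io.confluent.kafka.serializers.KafkaAvroDeserializer</parameter>
        <parameter name="schema.registry.url">http://localhost:8081/</parameter>
      </parameters>
    </inboundEndpoint>

The kafka_process_seq sequence referenced above can be as simple as a single log mediator, deployed in the sequences folder next to inbound-endpoints:

    <sequence xmlns="http://ws.apache.org/ns/synapse" name="kafka_process_seq">
      <log level="full"/>
    </sequence>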

Testing the inbound endpoint

Create a new topic in Kafka by executing the following in the <Kafka Home> folder:
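For example, to create the test-topic used in the inbound endpoint sketch above (Kafka 2.2.x still accepts the --zookeeper option):

    bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic test-topic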

Let's use a prebuilt client to generate Avro messages and publish them to the Kafka topic. Clone the https://github.com/datastax/kafka-examples repo and go to the producers folder. Execute the following command to generate Avro messages:
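The exact command depends on which producer example in that repo you run. As an alternative sketch, the kafka-avro-console-producer bundled with confluent-5.2.1 can also publish a few test Avro records to the same topic (the schema and record below are illustrative):

    bin/kafka-avro-console-producer --broker-list localhost:9092 --topic test-topic \
      --property schema.registry.url=http://localhost:8081 \
      --property value.schema='{"type":"record","name":"TestRecord","fields":[{"name":"message","type":"string"}]}'
    {"message": "hello from avro"}

Type each record as a JSON line and press Enter to publish it.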

In MI, you can see that the Avro messages are consumed and printed on the console.
