Monthly Archives: December 2018

Removing Confluent Kafka from RHEL7

Confluent packages can be removed the same way they were installed. You can always download the package archive and run the services standalone, but if you installed Confluent through yum, it can be removed just as easily. Here we are removing it using the yum remove command.

[Screenshot: yum remove]
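The command itself is roughly the following (a sketch; the package glob is an assumption, and the exact package names depend on which Confluent components you installed):

    # list the installed Confluent packages first
    rpm -qa | grep confluent
    # remove everything that came from the Confluent yum repository
    sudo yum remove confluent-*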

After asking for confirmation, yum erases the installed components as follows:

[Screenshot: yum 'Is this ok' confirmation and erase output]

With the older version removed, we can get the new packages and run them directly. Here we have downloaded the whole platform archive and are running it with the confluent CLI straight from the bin folder:

[Screenshot: confluent start]
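In shell form this amounts to something like the following (a sketch; the archive name and version are assumptions):

    # unpack the downloaded platform archive and start all services
    tar -xzf confluent-5.0.0-2.11.tar.gz
    cd confluent-5.0.0
    # the bundled Confluent CLI starts ZooKeeper, Kafka, Schema Registry, etc.
    ./bin/confluent start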

com.esotericsoftware.kryo.KryoException: Unusual solution upgrading Flink

While upgrading Flink from 1.4 to the newer 1.6.1, we ran into a few build issues. After we fixed those, we suddenly started getting a KryoException. Why are we getting this, and how is it related to the upgrade? Let's start with the exception message. If you start the job a few times and have several Source tasks, you will find it failing at a different stage each time, depending on the data the Source tasks receive from the topic. Here we are using the Kafka source in Flink.

The biggest puzzle was that the code ran fine in IntelliJ, but as soon as we ran it on a cluster, it threw this error.

And on restarting the job, it started throwing an error about another missing schema.

A number of solutions have been recommended by different developers, and some of them have worked in certain cases.

Disabling Kryo fallback
Flink allows us to disable Kryo fallback. Here are the details:
https://ci.apache.org/projects/flink/flink-docs-master/dev/execution_configuration.html
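Enabling the check looks roughly like this (a minimal sketch using the Scala streaming API; disableGenericTypes() makes any type that would fall back to Kryo fail fast at submission instead):

    import org.apache.flink.streaming.api.scala._

    val env = StreamExecutionEnvironment.getExecutionEnvironment
    // refuse types Flink cannot serialize natively instead of
    // silently handing them to Kryo
    env.getConfig.disableGenericTypes()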

But this didn’t work for us; we kept getting the same errors whenever we ran the job.

Solution

The solution was not to package the Flink runtime libraries into the fat JAR built with sbt assembly. We can do that by declaring the dependencies as follows:
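A sketch of the build.sbt change, assuming Flink 1.6.1 and the Scala API (the exact module list depends on the job):

    val flinkVersion = "1.6.1"

    // "provided" keeps the Flink runtime out of the fat JAR;
    // the cluster already ships these classes
    libraryDependencies ++= Seq(
      "org.apache.flink" %% "flink-scala"           % flinkVersion % "provided",
      "org.apache.flink" %% "flink-streaming-scala" % flinkVersion % "provided"
    )

This mirrors Flink's own quickstart templates, which mark the runtime modules as provided for exactly this reason.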

Flink Upgrade SBT Assembly Merge Strategy

We decided to upgrade Flink to 1.6.1. With the code updated, everything ran fine on the development machine, but I started getting build issues as soon as I pushed to source control. The TeamCity build agent picked up the change and the build failed with sbt assembly's deduplicate error: multiple dependencies were shipping their own log4j.properties file.

We were using sbt assembly, and the failure turned out to be reproducible on our own machines too.

So the natural solution was to just pick one log4j.properties file and at least get it to build:
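In build.sbt the workaround looks roughly like this (a sketch, assuming the sbt-assembly plugin; it keeps the first log4j.properties it sees and defers to the default strategy for everything else):

    assemblyMergeStrategy in assembly := {
      // several dependencies ship their own log4j.properties; keep the first one
      case PathList(ps @ _*) if ps.last == "log4j.properties" => MergeStrategy.first
      case x =>
        // fall back to sbt-assembly's default strategy for all other paths
        val oldStrategy = (assemblyMergeStrategy in assembly).value
        oldStrategy(x)
    }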

That got us past the build, but then another error appeared, this time related to Avro.

So we just added the flink-avro dependency to the sbt file and the issue was fixed. Flink's Avro support is documented here:
https://ci.apache.org/projects/flink/flink-docs-master/dev/batch/connectors.html#avro-support-in-flink
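The dependency addition is roughly the following (a sketch, assuming Flink 1.6.1; flink-avro is published without a Scala suffix in this version, hence the single %):

    // pull in Flink's Avro serialization support
    libraryDependencies += "org.apache.flink" % "flink-avro" % "1.6.1"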