Kafka Tools – kafka.tools.GetOffsetShell GetOffsetShell can be used to get the last offsets of a topic or of individual partitions of a topic. The tool ships with your Kafka installation and can be run using kafka-run-class. Here we are getting the offsets for the Merged-SensorsHeartbeat topic.
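A minimal sketch of the invocation, assuming a local broker on port 9092; `--time -1` asks for the latest offsets, `-2` for the earliest:

```shell
# Run the GetOffsetShell tool via kafka-run-class (broker address assumed)
bin/kafka-run-class.sh kafka.tools.GetOffsetShell \
  --broker-list localhost:9092 \
  --topic Merged-SensorsHeartbeat \
  --time -1
```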
Kafka Streams Config’s General Properties In this post, I am collecting the Kafka Streams configuration that I have to use over and over again, so it can be reused when developing a new Kafka Streams application. First we need to introduce the configuration for the stream, which includes details of the broker and… Continue reading »
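A minimal sketch of the recurring properties, using the real `StreamsConfig` constants; the application id, broker address, and String serdes below are assumptions for illustration:

```java
// Common baseline configuration for a Kafka Streams application
Properties props = new Properties();
props.put(StreamsConfig.APPLICATION_ID_CONFIG, "my-streams-app");      // assumed id
props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");   // assumed broker
props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());
```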
Kafka Streams: Converting KStream to KTable There is no method in KStream to convert it into a KTable, but there are workarounds to do that. They are as follows: Write to a Kafka topic and read it back as a KTable We can write to an intermediate Kafka topic and read it back using StreamsBuilder. Here is… Continue reading »
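The intermediate-topic workaround can be sketched as follows; the topic names and String types are assumptions:

```java
// Write the KStream to an intermediate topic, then read it back as a KTable
StreamsBuilder builder = new StreamsBuilder();
KStream<String, String> stream = builder.stream("input-topic");   // assumed source
stream.to("intermediate-topic");                                  // materialize to a topic
KTable<String, String> table = builder.table("intermediate-topic"); // read back as KTable
```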
Using Landoop’s Schema Registry UI In this post, we are going to set up Schema Registry UI for Confluent’s Schema Registry using the Docker image for the tool. Schema Registry UI is an amazing tool by Landoop. It’s available on GitHub. Let’s first pull the image from Docker Hub. Then we can simply run the Docker image by… Continue reading »
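A sketch of the pull-and-run steps; the Schema Registry URL assumes a local Confluent setup on the default port 8081:

```shell
# Pull Landoop's Schema Registry UI image and run it on port 8000
docker pull landoop/schema-registry-ui
docker run -d -p 8000:8000 \
  -e "SCHEMAREGISTRY_URL=http://localhost:8081" \
  landoop/schema-registry-ui
```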
Installing Confluent Platform on Mac OS There are no direct download instructions available on the Confluent Platform site yet for installing the packages on Mac OS. I just want to share the steps with others, in case someone needs some help with this. Download the Archive The zip packages can be downloaded from here: Download…. Continue reading »
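Once the archive is downloaded, the install is essentially unpack-and-add-to-PATH; this is a sketch, with the version number and install path as placeholders:

```shell
# Unpack the downloaded zip archive and put the tools on the PATH
unzip confluent-x.y.z.zip -d /opt
export CONFLUENT_HOME=/opt/confluent-x.y.z
export PATH=$PATH:$CONFLUENT_HOME/bin
```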
Confluent Components &amp; Their HTTP Ports It is very important to remember the ports of the individual components of the Confluent Platform. The following is the list of default ports for the respective components. Remember that these ports can be overridden during deployment. More details can be found on this page on confluent.io, where… Continue reading »
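As an example of overriding a default port, Schema Registry exposes a `listeners` property; the port 8090 below is an assumption (the default is 8081):

```properties
# schema-registry.properties: bind Schema Registry to 8090 instead of the default 8081
listeners=http://0.0.0.0:8090
```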
Kafka and Confluent Logs in Log File During development and debugging, it is very useful to see traces for Kafka and Confluent in a log file to determine the issues. The traces are especially useful when we use the DSL for Kafka Streams. Here is the Logback configuration to use Logback appenders for Kafka and Confluent logs.
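A minimal sketch of the relevant `logback.xml` loggers; the `FILE` appender is assumed to be defined elsewhere in the configuration, and the `DEBUG` level is a choice for development:

```xml
<!-- Route Kafka and Confluent traces to a file appender named FILE -->
<logger name="org.apache.kafka" level="DEBUG">
  <appender-ref ref="FILE"/>
</logger>
<logger name="io.confluent" level="DEBUG">
  <appender-ref ref="FILE"/>
</logger>
```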
Log4J Dependency with Kafka Many of the Kafka artifacts have Log4J dependencies. If you are using Logback, then your logging can stop all of a sudden. Here is an example from one of our sample apps. Here we have added such a dependency, and the app has issues with multiple SLF4J bindings. It has finally… Continue reading »
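One common fix is to exclude the Log4J SLF4J binding from the Kafka artifact so that Logback's binding is the only one on the classpath; a Maven sketch, with the version as a placeholder:

```xml
<!-- Exclude the transitive slf4j-log4j12 binding pulled in by Kafka -->
<dependency>
  <groupId>org.apache.kafka</groupId>
  <artifactId>kafka-streams</artifactId>
  <version>${kafka.version}</version>
  <exclusions>
    <exclusion>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-log4j12</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```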
Kafka Message Size Issue – Record Too Long to Send As the message size increases, Kafka starts throwing errors when sending the message. Here is the error message we received when sending a message of a few megabytes. The message clearly suggests that the size needs to be updated in some configuration… Continue reading »
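The size limits span the broker, producer, and consumer, and all need to agree; a sketch of the relevant settings, with the 10 MB value as an assumption:

```properties
# Broker (server.properties): maximum record batch size the broker accepts
message.max.bytes=10485760
replica.fetch.max.bytes=10485760
# Producer: maximum size of a request
max.request.size=10485760
# Consumer: maximum data returned per partition per fetch
max.partition.fetch.bytes=10485760
```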
Using Avro Schema in JVM-based Applications Like Protobuf, Apache Avro is especially suitable for organizations with a polyglot development stack. Data serialized in one language can easily be deserialized in another language on the other end of your messaging platform. It is one of the supported serialization formats for the Confluent Platform. The benefits of Avro over… Continue reading »
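For reference, an Avro schema is a JSON document (`.avsc`); this record definition is a hypothetical example, with the name, namespace, and fields invented for illustration:

```json
{
  "type": "record",
  "name": "SensorHeartbeat",
  "namespace": "com.example.sensors",
  "fields": [
    {"name": "sensorId", "type": "string"},
    {"name": "timestamp", "type": "long"}
  ]
}
```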