Securing JMX on Confluent Kafka Confluent Kafka processes start with these default arguments. You can see that JMX authentication is disabled by default. This is a security vulnerability and might lead to issues. Confluent Kafka consists of the following services, and we need to enable JMX authentication for all of them: Kafka Broker… Continue reading »
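A minimal sketch of what enabling JMX authentication looks like, using the standard JVM remote-JMX system properties; the credential file paths here are illustrative assumptions, not paths from the post:

```shell
# Enable password-based JMX authentication for the broker process.
# The password/access file locations below are hypothetical examples.
export KAFKA_JMX_OPTS="-Dcom.sun.management.jmxremote \
  -Dcom.sun.management.jmxremote.authenticate=true \
  -Dcom.sun.management.jmxremote.ssl=false \
  -Dcom.sun.management.jmxremote.password.file=/etc/kafka/jmxremote.password \
  -Dcom.sun.management.jmxremote.access.file=/etc/kafka/jmxremote.access"
```

The same `KAFKA_JMX_OPTS`-style override would need to be applied to each Confluent service, not just the broker.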
Kafka Tools – kafkacat – non-JVM Kafka producer / consumer kafkacat is an amazing Kafka tool based on the librdkafka library, a C/C++ library for Kafka. This means it has no dependency on the JVM to work with Kafka data as an administrator. It can be used to consume and produce messages from Kafka topics… Continue reading »
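Typical consume and produce invocations look like the following; the broker address and topic name are assumptions for illustration:

```shell
# Consume a topic from the beginning (-C), exit when the end is reached (-e)
kafkacat -b localhost:9092 -t my-topic -C -o beginning -e

# Produce a message to the topic from stdin (-P)
echo "hello" | kafkacat -b localhost:9092 -t my-topic -P
```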
Apache Flink – Change Port for web front end Apache Flink runs its dashboard on port 8081. Since this is a common port, it might conflict with other services running on the same machine. You might encounter this scenario especially during development, when many services are running on your development machine. But… Continue reading »
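A sketch of the change, assuming a recent Flink where the dashboard port is controlled by the `rest.port` key in `flink-conf.yaml` (older releases used `jobmanager.web.port`); the target port 8082 is just an example:

```shell
# Move the Flink web dashboard off the default port 8081
echo "rest.port: 8082" >> "$FLINK_HOME/conf/flink-conf.yaml"
```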
Kafka-Consumer-Groups – Kafka CLI Tools Kafka-Consumer-Groups is a CLI tool that can be used to inspect message consumption from Kafka. The tool lists the topics and partitions consumed by a consumer group and also shows whether a consumer is lagging in its consumption of messages. First we need… Continue reading »
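The two most common invocations can be sketched as follows; the broker address and group name are assumed for illustration:

```shell
# List all consumer groups known to the cluster
kafka-consumer-groups --bootstrap-server localhost:9092 --list

# Describe one group: shows topic, partition, current offset,
# log-end offset and the per-partition LAG column
kafka-consumer-groups --bootstrap-server localhost:9092 \
  --describe --group my-group
```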
Deleting Connectors from Kafka Connect A connector can be deleted using Kafka Connect's REST API. Here is the curl command:
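A minimal version of that curl command, assuming the Connect worker's REST interface is on its default port 8083 and a hypothetical connector name:

```shell
# DELETE /connectors/<name> removes the connector and its tasks
curl -X DELETE http://localhost:8083/connectors/my-connector
```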
Kafka Streams – Resetting Application State In the previous post, we discussed that we might need to reprocess data during application development. Since Kafka Streams preserves the application state, it doesn't pull and reprocess data. In this case, you might find yourself waiting for the join operations to get triggered, but to… Continue reading »
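Kafka ships an application reset tool for exactly this; a sketch of its use, with the application id, broker address and topic name assumed for illustration (the local state directory must also be cleaned, e.g. via `KafkaStreams#cleanUp()`):

```shell
# Reset committed offsets and intermediate topics for the Streams app
kafka-streams-application-reset \
  --bootstrap-servers localhost:9092 \
  --application-id my-streams-app \
  --input-topics my-input-topic
```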
Kafka Connect Issues: Found null value for non-optional schema In a custom connector, you might encounter the following issue: Here we have a schema defined for one of our custom connectors: You will see this error when you create a SourceRecord whose value is missing the non-optional fields.
Building Serde for Kafka Streams Application While developing Kafka Streams applications, I found myself needing some utility code over and over. One such piece of code builds a Serde for custom types. It is better refactored into a separate type and reused when needed. It has two methods, specificAvroSerde and genericAvroSerde. It must be… Continue reading »
Kafka Tools – kafka.tools.GetOffsetShell GetOffsetShell can be used to get the latest offsets of a topic, or of individual partitions of a topic. The tool ships with your Kafka installation and can be run using kafka-run-class. Here we are getting the offsets for the Merged-SensorsHeartbeat topic.
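A sketch of that invocation; the broker address is an assumption, and `--time -1` asks for the latest offsets (`-2` would ask for the earliest):

```shell
# Print the latest offset of each partition of the topic
kafka-run-class kafka.tools.GetOffsetShell \
  --broker-list localhost:9092 \
  --topic Merged-SensorsHeartbeat \
  --time -1
```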
Kafka Tools – Mirror Maker MirrorMaker is a Kafka tool for copying data from one cluster to another. In this post we are going to see how we can run MirrorMaker to copy data from one cluster to another. This can be especially useful when we want to copy data between two… Continue reading »
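A minimal sketch of a legacy MirrorMaker run, assuming hypothetical config files pointing the consumer at the source cluster and the producer at the target cluster:

```shell
# Mirror every topic (whitelist is a regex) from source to target
kafka-mirror-maker \
  --consumer.config source-consumer.properties \
  --producer.config target-producer.properties \
  --whitelist ".*"
```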