Separate log file for connector

The logs for Kafka connectors are generally written to connectDistributed.out. To create a separate log file for your connector, update the Log4j configuration file at ../etc/kafka/connect-log4j.properties. Add the following lines to the file:

log4j.appender.connectorNameSpace=org.apache.log4j.RollingFileAppender
log4j.appender.connectorNameSpace.DatePattern='.'yyyy-MM-dd-HH
log4j.appender.connectorNameSpace.MaxFileSize=10MB
log4j.appender.connectorNameSpace.MaxBackupIndex=10
log4j.appender.connectorNameSpace.File=${kafka.logs.dir}/connect-my.log
log4j.appender.connectorNameSpace.layout=org.apache.log4j.PatternLayout
log4j.appender.connectorNameSpace.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n

… Continue reading »
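The appender above only takes effect once a logger is routed to it. As a sketch, assuming the connector's classes live under a hypothetical package com.example.myconnector, lines like the following would direct its output to the new file:

```properties
# Route the (hypothetical) connector package to the appender defined above
log4j.logger.com.example.myconnector=INFO, connectorNameSpace
# Prevent the same messages from also propagating up to the root logger
# (and thus into connectDistributed.out)
log4j.additivity.com.example.myconnector=false
```

Setting additivity to false is what keeps the connector's messages out of the shared connectDistributed.out file.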

Angular – Progressive Web Apps using Service Workers

Service workers enable building progressive web applications. A service worker is a script that runs in the web browser and caches the application's resources. It preserves those resources even after the user closes the tab, and serves them when the application is requested again. Setting up the Angular project: Let's… Continue reading »
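As a rough sketch of the caching side, Angular's service worker is driven by an ngsw-config.json file that declares which resources to cache and how; the group names and glob patterns below are illustrative:

```json
{
  "$schema": "./node_modules/@angular/service-worker/config/schema.json",
  "index": "/index.html",
  "assetGroups": [
    {
      "name": "app",
      "installMode": "prefetch",
      "resources": { "files": ["/index.html", "/*.css", "/*.js"] }
    },
    {
      "name": "assets",
      "installMode": "lazy",
      "updateMode": "prefetch",
      "resources": { "files": ["/assets/**"] }
    }
  ]
}
```

Here "prefetch" caches the application shell up front, while "lazy" caches assets only once they are first requested.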

Hyperledger Fabric – Terminology

Hyperledger Fabric is private and permissioned: to gain access, peers must be enrolled through a Membership Service Provider (MSP). It uses a deterministic consensus algorithm. Asset: an asset holds state and has ownership. Assets are key/value pairs representing a value, which enables to… Continue reading »
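For illustration, an asset in the world state could be stored as a key/value pair like the following (the key and field names are made up for this example):

```json
{
  "key": "CAR0",
  "value": { "make": "Toyota", "model": "Prius", "owner": "Tomoko" }
}
```

The key identifies the asset on the ledger, while the value captures its current state, including ownership.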

Hyperledger Fabric Composer & web sandbox

Hyperledger Composer is a framework for developing Hyperledger Fabric based blockchain applications. It must be noted that it has been deprecated with Fabric 1.4, released in August 2019. This is the first long term support release for Hyperledger Fabric. The maintainers have pledged to provide bug… Continue reading »

Kafka Connect – Externalizing Secrets – KIP 297

To connect to a data source or sink, we need credentials. Kafka Connect added support for specifying credentials using config providers, and a file config provider ships with the installation package. This is discussed in KIP 297. The KIP was released… Continue reading »
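As a sketch, using the bundled file config provider involves two pieces: registering the provider in the worker configuration, and referencing a key from an external secrets file in the connector configuration. The file path and key names here are illustrative:

```properties
# Worker configuration (e.g. connect-distributed.properties)
config.providers=file
config.providers.file.class=org.apache.kafka.common.config.provider.FileConfigProvider

# Connector configuration: resolve the password from an external secrets file
# (path and key name are illustrative)
connection.password=${file:/opt/kafka/secrets/connect-secrets.properties:db.password}
```

With this in place, the plaintext password lives only in the secrets file on the worker host, not in the connector configuration submitted over the REST API.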

Securing JMX on Confluent Kafka

Confluent Kafka processes start with these default arguments. You can see that JMX authentication is disabled by default, which is a security vulnerability. Confluent Kafka consists of the following services, and we need to enable JMX authentication for all of them: Kafka Broker… Continue reading »
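As a sketch of what enabling authentication looks like for one of these services, the standard JVM remote-JMX flags below turn on password-based authentication (the port and file paths are illustrative, and the password and access files must exist with restrictive permissions):

```shell
# Enable remote JMX with password authentication (illustrative port and paths)
export KAFKA_JMX_OPTS="-Dcom.sun.management.jmxremote \
  -Dcom.sun.management.jmxremote.port=9999 \
  -Dcom.sun.management.jmxremote.authenticate=true \
  -Dcom.sun.management.jmxremote.ssl=false \
  -Dcom.sun.management.jmxremote.password.file=/etc/kafka/jmxremote.password \
  -Dcom.sun.management.jmxremote.access.file=/etc/kafka/jmxremote.access"
```

Setting ssl=false keeps the example minimal; for production you would normally also enable SSL on the JMX connection.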