Kafka task#
Apache Kafka is a high-performance, open-source distributed event streaming platform developed by the Apache Software Foundation. It is designed for building real-time data pipelines and stream-processing applications that handle high-throughput, fault-tolerant, and scalable message flows across systems. (More info: Apache Kafka Link)

⚠️ Download the required JAR files and copy them into the CrushFTP/plugins/lib/ directory:
Download kafka-clients.jar version 4.0.0 Link

Download kafka-streams.jar version 4.0.0 Link

After placing the files, restart the CrushFTP service to apply the changes.
More info:
• Supported Kafka Broker Version: 4.0.x
• Minimum Java Version: Java 11 (supports 11, 17, 23)
The Kafka Task allows you to send either file contents or custom messages to a specified Kafka topic.

To use custom SASL connection settings, enable the Load custom client config flag. This allows you to provide your own client configuration properties.

Custom SASL Config example:
#KafkaTask SASL properties file example
# Kafka server host and port.
bootstrap.servers=192.168.0.10:9092
# Client id
client.id=MyCrushFTP
# Serializer (Do not change!)
key.serializer=org.apache.kafka.common.serialization.StringSerializer
value.serializer=org.apache.kafka.common.serialization.ByteArraySerializer
# Max block and timeout
max.block.ms=10000
request.timeout.ms=20000
# Authentication related settings
security.protocol=SASL_PLAINTEXT
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
  username="kafka-username" \
  password="kafka-password";
# Optional
acks=all
retries=3
metadata.max.age.ms=3000
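For reference, the sketch below shows roughly how a plain Java client using the kafka-clients API would load a properties file like the one above and publish a file's contents to a topic. It is illustrative only, not CrushFTP's internal implementation; the properties file name, topic name, and file path are placeholders.

// Illustrative sketch only: load a client properties file like the example above
// and publish a file's bytes to a topic. "kafka_client.properties", "my-topic"
// and the file path are placeholders.
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.clients.producer.RecordMetadata;

public class KafkaSendExample {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        try (var in = Files.newInputStream(Paths.get("kafka_client.properties"))) {
            props.load(in);   // bootstrap.servers, serializers, SASL settings, etc.
        }
        byte[] payload = Files.readAllBytes(Paths.get("/path/to/file.dat"));
        try (Producer<String, byte[]> producer = new KafkaProducer<>(props)) {
            // Key is a String, value is the raw file content; this matches the
            // StringSerializer / ByteArraySerializer pair in the config above.
            ProducerRecord<String, byte[]> record =
                new ProducerRecord<>("my-topic", "file.dat", payload);
            RecordMetadata meta = producer.send(record).get();  // blocks until acknowledged
            System.out.println("Sent to partition " + meta.partition()
                + " at offset " + meta.offset());
        }
    }
}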
Documentation Links:
• Apache Kafka – Security (SASL) Link

• JAAS Configuration for Kafka Clients Link

KafkaTask SASL properties Explained:
# Kafka server host and port:
• ⚠️ Specifies the address of the Kafka broker to connect to.
• You can list multiple brokers separated by commas for failover: 192.168.0.10:9092,192.168.0.11:9092
# Client id:
• A unique identifier for the client. This is useful for logging and monitoring within Kafka.
# Serializer (Do not change!):
• These specify how the key and value of each Kafka message are serialized.
• Key serializer: Converts the message key to a string.
• Value serializer: Converts the message content (e.g. a file) into a byte array.
• ⚠️ Do not change!
# Max block and timeout:
• max.block.ms: Maximum time (in milliseconds) a send call will block if the buffer is full. Here, 10 seconds.
• request.timeout.ms: Time before a request is considered failed due to no response. Set to 20 seconds here.
# Authentication related settings:
• security.protocol: Sets the protocol the client uses to communicate with Kafka (for example SASL_PLAINTEXT or SASL_SSL; see below).
• sasl.mechanism: The SASL mechanism used for authentication (PLAIN in this example).
• sasl.jaas.config: The JAAS (Java Authentication and Authorization Service) configuration line used to authenticate with Kafka via SASL/PLAIN.
• username / password: ⚠️ Replace these with your actual Kafka credentials.
Kafka Security Protocols:
• SASL_PLAINTEXT: SASL authentication over a plaintext (non-encrypted) channel.
• SASL_SSL: A security protocol that combines SSL encryption with SASL authentication. SSL ensures that all data transmitted is encrypted, while SASL (e.g., PLAIN or SCRAM) handles user authentication over that secure channel. This provides both confidentiality and identity verification for Kafka clients and brokers.
Example:
# Authentication related settings
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
  username="kafka-username" \
  password="kafka-password";
# SSL Truststore (used to verify the Kafka server cert)
# The truststore contains public certificates of trusted servers, in this case the Kafka server's certificate.
ssl.truststore.location=/Users/crushftp/kafka.server.truststore.jks
ssl.truststore.password=truststorepass
# Optional: only needed if mutual TLS is required.
# The keystore contains the client's private key and its signed certificate (usually self-signed or signed by a CA).
# Broker-side setting: ssl.client.auth=required tells the broker to request and validate a client certificate during the SSL handshake.
ssl.keystore.location=/Users/crushftp/client.keystore.jks
ssl.keystore.password=clientpass
ssl.key.password=clientpass
• SASL_SSL with SCRAM-SHA-512 mechanism: This configuration provides strong security by combining SSL encryption with SCRAM-SHA-512 authentication. SSL ensures encrypted communication between clients and brokers, while SCRAM-SHA-512 performs secure password-based authentication using salted, hashed credentials. It’s a preferred option for production environments requiring both confidentiality and strong identity verification without relying on plaintext passwords.
Example:
# Authentication related settings
security.protocol=SASL_SSL
sasl.mechanism=SCRAM-SHA-512
sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required \
  username="kafka-username" \
  password="kafka-password";
# SSL Truststore (used to verify the Kafka server cert)
ssl.truststore.location=/Users/kz/crushftp/cert_or_keys/kafka.server.truststore.jks
ssl.truststore.password=truststorepass
# Optional: Only needed if mutual TLS is required
ssl.keystore.location=/Users/kz/crushftp/cert_or_keys/client.keystore.jks
ssl.keystore.password=clientpass
ssl.key.password=clientpass
• SASL with GSSAPI / Kerberos: SASL/GSSAPI leverages Kerberos to perform strong, mutual authentication: clients obtain a service ticket from the KDC and present it to the broker, eliminating the need to send passwords over the wire. Once authenticated, it can be combined with SSL to provide both encryption and integrity for all Kafka traffic. This mechanism is ideal in enterprise environments where centralized credential management and single‐sign‐on are required.
When using CrushFTP’s KafkaTask with Kerberos, you can authenticate via a keytab. Example:
# Authentication related settings
security.protocol=SASL_SSL
sasl.mechanism=GSSAPI
sasl.kerberos.service.name=kafka
sasl.jaas.config=com.sun.security.auth.module.Krb5LoginModule required \
  useKeyTab=true \
  keyTab="/etc/security/kafka.keytab" \
  principal="kafka@LMINT.COM";
# SSL Truststore (to verify the Kafka server cert)
ssl.truststore.location=/etc/security/kafka.server.truststore.jks
ssl.truststore.password=truststorepass
# Optional: mutual TLS if broker demands it
ssl.keystore.location=/etc/security/client.keystore.jks
ssl.keystore.password=clientpass
ssl.key.password=clientpass
Key points for GSSAPI / Kerberos setups:
1. /etc/krb5.conf must define your realm and KDC (a minimal sketch follows this list).
2. Make sure your keytab contains the exact Kafka service principal (in our example kafka@LMINT.COM). Without that entry (and the correct KVNO, see Wikipedia: Key Version Number Link), authentication will fail.

3. When you enable ssl.client.auth=required on the broker, every client must present a valid TLS certificate during the handshake. Make sure each client has a keystore containing its private key and signed certificate, and a truststore that includes the CA (or broker) certificate, so both sides can authenticate and encrypt the connection. In short, the SSL keystore/truststore files must exist and match your broker's certificates.
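For illustration only, a minimal /etc/krb5.conf covering point 1 might look like the sketch below. The realm LMINT.COM is taken from the example above; the KDC hostname kdc.lmint.com is a placeholder and must be replaced with your own KDC.

# Minimal example; kdc.lmint.com is a placeholder for your real KDC host
[libdefaults]
    default_realm = LMINT.COM

[realms]
    LMINT.COM = {
        kdc = kdc.lmint.com
        admin_server = kdc.lmint.com
    }

[domain_realm]
    .lmint.com = LMINT.COM
    lmint.com = LMINT.COM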
Optional:
acks=all: The producer will wait for all in-sync replicas (ISRs) to acknowledge the record before considering the write successful. Ensures that data isn’t lost if a broker crashes. Slightly increases latency compared to acks=1 or acks=0.
retries=3: The producer will retry sending a record up to 3 times if a transient failure occurs (e.g., timeout, disconnect).
metadata.max.age.ms=3000: The producer will refresh its cached metadata (e.g., topic partitions, broker info) every 3 seconds. Lower values increase responsiveness but may cause more overhead due to more frequent metadata fetches. Usually, it is 300,000 ms (5 minutes), so 3,000 ms is very aggressive and useful only in dynamic environments.
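To make the effect of acks and retries concrete, the minimal sketch below (illustrative only; the broker address, topic name, and payload are placeholders) uses the kafka-clients producer API: the send() callback fires once the configured acks level has been satisfied, and a transient error is surfaced only after the configured retries are exhausted.

// Illustrative sketch: how acks/retries surface in producer code.
// With acks=all the callback fires once all in-sync replicas have the record;
// with retries=3 transient errors are retried before 'exception' is non-null.
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class AckExample {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "192.168.0.10:9092");   // placeholder broker
        props.put("key.serializer",
            "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
            "org.apache.kafka.common.serialization.ByteArraySerializer");
        props.put("acks", "all");     // wait for all in-sync replicas
        props.put("retries", "3");    // retry transient failures up to 3 times
        try (Producer<String, byte[]> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("my-topic", "key", "hello".getBytes()),
                (metadata, exception) -> {
                    if (exception != null) {
                        System.err.println("Send failed after retries: " + exception);
                    } else {
                        System.out.println("Acknowledged at offset " + metadata.offset());
                    }
                });
            producer.flush();         // wait for outstanding sends before closing
        }
    }
}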