List of attachments

||Kind||Attachment Name||Size||Version||Date Modified||Author||Change note
|jar|kafka-clients.jar|9,187.7 kB|2|26-May-2025 10:05|krivacsz| |
|jar|kafka-streams.jar|2,023.8 kB|2|26-May-2025 10:05|krivacsz| |
|png|kafka_custom_config.png|214.3 kB|1|05-Dec-2023 05:32|krivacsz| |
|png|kafka_task.png|79.1 kB|1|05-Dec-2023 05:32|krivacsz| |

This page (revision-66) was last changed on 09-Jun-2025 03:30 by krivacsz

This page was created on 05-Dec-2023 05:32 by krivacsz

Difference between versions

At line 1 changed one line
!!Kafka task
!!!Kafka task
At line 3 changed 2 lines
__Apache Kafka__ is a high-performance, open-source distributed event streaming platform developed by the Apache Software Foundation. It is designed for building real-time data pipelines and stream-processing applications that handle high-throughput, fault-tolerant, and scalable message flows across systems. It is an open-source stream-processing software platform developed by the Apache Software Foundation.
(More info : [Apache Kafak Link|https://kafka.apache.org/])\\
__Apache Kafka__ is a high-performance, open-source distributed event streaming platform developed by the __Apache Software Foundation__. It is designed for building real-time data pipelines and stream-processing applications that handle high-throughput, fault-tolerant, and scalable message flows across systems. (More info: [Apache Kafka Link|https://kafka.apache.org/])\\
At line 7 changed 3 lines
[kafka-clients.jar]\\
[kafka-streams.jar]\\
Then restart the CrushFTP service.\\
[Download kafka-clients.jar version 4.0.0 Link|kafka-clients.jar]\\
[Download kafka-streams.jar version 4.0.0 Link|kafka-streams.jar]\\
After placing the files, restart the CrushFTP service to apply the changes.\\
__More info:__\\
• Supported Kafka Broker Version 4.0.x\\
• Minimum Java Version: Java 11 (supports 11, 17, 23)\\
At line 16 changed one line
It is possible to load your own SASL connection parameters with the flag "Load custom client config"\\
To use custom __SASL__ connection settings, enable the __Load custom client config__ flag. This allows you to provide your own client configuration properties.\\
At line 20 changed one line
Config example:
__Custom SASL Config example__:
At line 26 changed one line
bootstrap.servers=192.168.10:9092
bootstrap.servers=192.168.0.10:9092
At line 47 added 82 lines
#Optional
acks=all
retries=3
metadata.max.age.ms=3000
}}}\\
\\
Documentation Links:\\
• [Apache Kafka – Security (SASL) Link|https://kafka.apache.org/documentation/#security_sasl]\\
• [JAAS Configuration for Kafka Clients Link|https://kafka.apache.org/documentation/#security_sasl_clientconfig]\\
\\
!! KafkaTask SASL properties Explained:\\
\\
__# Kafka server host and port__:\\
• Specifies the address of the Kafka broker to connect to.\\
• You can list multiple brokers separated by commas for failover: 192.168.0.10:9092,192.168.0.11:9092\\
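As a sketch, a failover-capable bootstrap list in the client config could look like this (the broker addresses are placeholders taken from the example above):
{{{
# Kafka server host and port (comma-separated list for failover)
bootstrap.servers=192.168.0.10:9092,192.168.0.11:9092
}}}\\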
\\
__# Client id__:\\
• A unique identifier for the client. This is useful for logging and monitoring within Kafka.\\
\\
__# Serializer (Do not change!)__:\\
• These specify how the key and value of each Kafka message are serialized.\\
• Key serializer: Converts the message key to a string.\\
• Value serializer: Converts the message content (e.g. a file) into a byte array.\\
• ⚠️ Do not change!\\
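For reference, the serializer lines in the client config use Kafka's standard serializer classes (a sketch; the exact values shipped with KafkaTask should be left as provided):
{{{
# Serializer (Do not change!)
key.serializer=org.apache.kafka.common.serialization.StringSerializer
value.serializer=org.apache.kafka.common.serialization.ByteArraySerializer
}}}\\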
\\
__# Max block and timeout__:\\
• max.block.ms: Maximum time (in milliseconds) a __send__ call will block if the buffer is full. Here, 10 seconds.\\
• request.timeout.ms: Time before a request is considered failed due to no response. Set to 20 seconds here.\\
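The two timeouts described above, written out as config lines (10 and 20 seconds, matching the values in the example):
{{{
# Max block and timeout
max.block.ms=10000
request.timeout.ms=20000
}}}\\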
\\
__# Authentication related settings__:\\
• security.protocol: Sets how the client communicates with Kafka.\\
• sasl.jaas.config: The JAAS (Java Authentication and Authorization Service) configuration line to authenticate with Kafka using SASL/PLAIN.\\
• username / password: ⚠️ Replace these with your actual Kafka credentials.\\
\\
! Kafka Security Protocols:\\
\\
----
• __SASL_PLAINTEXT__ means SASL authentication over a plaintext (non-encrypted) channel.\\
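A minimal SASL_PLAINTEXT sketch, mirroring the SASL_SSL examples that follow (host and credentials are placeholders; no SSL truststore is needed since the channel is unencrypted):
{{{
#Authentication related settings
security.protocol=SASL_PLAINTEXT
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
username="kafka-username" \
password="kafka-password";
}}}\\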
----
• __SASL_SSL__: a security protocol that combines SSL encryption with SASL authentication. SSL ensures that all data transmitted is encrypted, while SASL (e.g., PLAIN or SCRAM) handles user authentication over that secure channel. This provides both confidentiality and identity verification for Kafka clients and brokers.\\
Example:
{{{
#Authentication related settings
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
username="kafka-username" \
password="kafka-password";
# SSL Truststore (used to verify the Kafka server cert)
# The truststore contains public certificates of trusted servers — in this case, the Kafka server’s certificate.
ssl.truststore.location=/Users/crushftp/kafka.server.truststore.jks
ssl.truststore.password=truststorepass
# Optional: Only needed if mutual TLS is required
# It contains the client’s private key and its signed certificate (usually self-signed or signed by a CA).
# Server setting: ssl.client.auth=required tells the broker to request and validate a client certificate during the SSL handshake.
ssl.keystore.location=/Users/crushftp/client.keystore.jks
ssl.keystore.password=clientpass
ssl.key.password=clientpass
}}}\\
----
• __SASL_SSL with SCRAM-SHA-512 mechanism__: This configuration provides strong security by combining __SSL__ encryption with __SCRAM-SHA-512__ authentication. SSL ensures encrypted communication between clients and brokers, while SCRAM-SHA-512 performs secure password-based authentication using salted, hashed credentials. It’s a preferred option for production environments requiring both confidentiality and strong identity verification without relying on plaintext passwords.\\
\\
Example:\\
{{{
#Authentication related settings
security.protocol=SASL_SSL
sasl.mechanism=SCRAM-SHA-512
sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required \
username="kafka-username" \
password="kafka-password";
# SSL Truststore (used to verify the Kafka server cert)
ssl.truststore.location=/Users/kz/crushftp/cert_or_keys/kafka.server.truststore.jks
ssl.truststore.password=truststorepass
# Optional: Only needed if mutual TLS is required
ssl.keystore.location=/Users/kz/crushftp/cert_or_keys/client.keystore.jks
ssl.keystore.password=clientpass
ssl.key.password=clientpass
}}}\\
At line 130 added 33 lines
----
• __SASL with GSSAPI / Kerberos__: SASL/GSSAPI leverages Kerberos to perform strong, mutual authentication: clients obtain a service ticket from the KDC and present it to the broker, eliminating the need to send passwords over the wire. Once authenticated, it can be combined with SSL to provide both encryption and integrity for all Kafka traffic. This mechanism is ideal in enterprise environments where centralized credential management and single‐sign‐on are required.\\
\\
When using CrushFTP’s KafkaTask with Kerberos, you can authenticate via a __keytab__. Example:\\
{{{
# Authentication related settings
security.protocol=SASL_SSL
sasl.mechanism=GSSAPI
sasl.kerberos.service.name=kafka
sasl.jaas.config=com.sun.security.auth.module.Krb5LoginModule required \
useKeyTab=true \
keyTab="/etc/security/kafka.keytab" \
principal="kafka@LMINT.COM";
# SSL Truststore (to verify the Kafka server cert)
ssl.truststore.location=/etc/security/kafka.server.truststore.jks
ssl.truststore.password=truststorepass
# Optional: mutual TLS if broker demands it
ssl.keystore.location=/etc/security/client.keystore.jks
ssl.keystore.password=clientpass
ssl.key.password=clientpass
}}}\\
__Key points for GSSAPI / Kerberos setups__:\\
1. /etc/krb5.conf must define your realm and KDC.\\
2. Make sure your keytab contains the exact Kafka service principal (in our example kafka@LMINT.COM). Without that entry (and the correct KVNO [Wikipedia: Key Version Number Link|https://en.wikipedia.org/wiki/Kerberos_(protocol)#Key_Version_Number_(KVNO)]), the broker won’t be able to decrypt or verify incoming GSSAPI tickets.\\
3. When you enable ssl.client.auth=required on the broker, every client must present a valid TLS certificate during the handshake. Make sure each client has a keystore containing its private key and signed certificate, and a truststore that includes the CA (or broker) certificate, so both sides can authenticate and encrypt the connection; the SSL keystore/truststore files must exist and match your broker’s certificates.\\
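To check points 1–2, the MIT Kerberos klist tool can list the principals and key version numbers (KVNOs) stored in a keytab (the path matches the example config above):
{{{
klist -kt /etc/security/kafka.keytab
}}}\\
The output shows one row per keytab entry with its KVNO, timestamp, and principal; verify that the Kafka service principal appears with the KVNO the broker expects.\\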
----
__Optional:__\\
\\
__acks=all__: The producer will wait for all in-sync replicas (ISRs) to acknowledge the record before considering the write successful. Ensures that data isn’t lost if a broker crashes. Slightly increases latency compared to acks=1 or acks=0. \\
__retries=3__: The producer will retry sending a record up to 3 times if a transient failure occurs (e.g., timeout, disconnect).\\
__metadata.max.age.ms=3000__: The producer will refresh its cached metadata (e.g., topic partitions, broker info) every 3 seconds. Lower values increase responsiveness but add overhead from more frequent metadata fetches. The default is 300,000 ms (5 minutes), so 3,000 ms is very aggressive and useful only in dynamic environments.\\