Security configurations
Splunk Connect for Kafka supports the following security mechanisms:
- SSL
- SASL/GSSAPI (Kerberos), Kafka versions 0.9.0.0 and later
- SASL/PLAIN, Kafka versions 0.10.0.0 and later
- SASL/SCRAM-SHA-256, Kafka versions 0.10.2.0 and later
- SASL/SCRAM-SHA-512, Kafka versions 0.10.2.0 and later
SSL
Configure workers and SinkTasks to work with your SSL secured cluster:
- Navigate to `$KAFKA_CONNECT_HOME/config/connect-distributed.properties` to configure the Kafka Connect worker and consumer settings to use SSL.
- Adjust the `ssl.truststore.location` and `ssl.truststore.password` settings, both the top-level and the `consumer.`-prefixed versions, to reflect your setup.

```
# Worker security settings are located at the top level
security.protocol=SSL
ssl.truststore.location=/var/private/ssl/kafka.client.truststore.jks
ssl.truststore.password=test1234

# Sink security settings are prefixed with "consumer."
consumer.security.protocol=SSL
consumer.ssl.truststore.location=/var/private/ssl/kafka.client.truststore.jks
consumer.ssl.truststore.password=test1234
```
There is currently no way to change the configuration for connectors individually, but if your server supports client authentication over SSL, use a separate principal for the worker and the connectors. See Confluent's documentation on configuring workers and connectors with security for more information.
- Start Kafka Connect.

```
./bin/connect-distributed.sh config/connect-distributed.properties
```
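Because every worker-level security setting must be repeated under the `consumer.` prefix for the sink tasks, a forgotten duplicate is an easy mistake to make. The helper below is an illustrative sketch, not part of the connector: it parses a properties file and flags SSL settings that were set at the top level but not mirrored under `consumer.`.

```python
# Illustrative helper (not a connector utility): flag worker-level SSL settings
# that were not duplicated under the "consumer." prefix.
def missing_consumer_settings(text):
    props = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        key, _, value = line.partition("=")
        props[key.strip()] = value.strip()
    checked = ["security.protocol",
               "ssl.truststore.location",
               "ssl.truststore.password"]
    return [k for k in checked if k in props and "consumer." + k not in props]

example = """\
security.protocol=SSL
ssl.truststore.location=/var/private/ssl/kafka.client.truststore.jks
ssl.truststore.password=test1234
consumer.security.protocol=SSL
consumer.ssl.truststore.location=/var/private/ssl/kafka.client.truststore.jks
"""
print(missing_consumer_settings(example))  # -> ['ssl.truststore.password']
```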
SASL/GSSAPI (Kerberos)
Configure Kafka Connect when your Kafka cluster is secured using Kerberos.
- Configure the Kafka Connect worker and consumer settings to use Kerberos in `config/connect-distributed.properties`.

```
# Worker security settings are located at the top level
security.protocol=SASL_PLAINTEXT
sasl.mechanism=GSSAPI
sasl.kerberos.service.name=kafka

# Sink security settings are prefixed with "consumer."
consumer.security.protocol=SASL_PLAINTEXT
consumer.sasl.mechanism=GSSAPI
```
- Modify `bin/connect-distributed.sh` by editing the `EXTRA_ARGS` environment variable.
- Pass in the location of the JAAS configuration file. Optionally, specify the path to your Kerberos configuration file and set Kerberos debugging to `true` to troubleshoot connection issues.

```
EXTRA_ARGS=${EXTRA_ARGS-'-name connectDistributed -Djava.security.krb5.conf=/etc/krb5.conf -Djava.security.auth.login.config=/root/kafka_connect_jaas.conf -Dsun.security.krb5.debug=true'}
```
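The `${EXTRA_ARGS-'...'}` form in the start script is POSIX "use default if unset" parameter expansion: the quoted default applies only when `EXTRA_ARGS` is not set at all, so a value already exported in your environment takes precedence. A quick sketch of the behavior (the `-D` flags are shortened placeholders):

```shell
# POSIX ${VAR-default}: the default is used only when VAR is unset.
DEFAULT='-name connectDistributed -Djava.security.auth.login.config=/root/kafka_connect_jaas.conf'

unset EXTRA_ARGS
DEMO1=${EXTRA_ARGS-$DEFAULT}      # EXTRA_ARGS unset -> default applies

EXTRA_ARGS='-name connectDistributed -Dsun.security.krb5.debug=true'
DEMO2=${EXTRA_ARGS-$DEFAULT}      # EXTRA_ARGS set -> exported value wins

echo "default applied: $DEMO1"
echo "override kept:   $DEMO2"
```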
For example, a Kafka client JAAS file using the principal connect:

```
KafkaClient {
    com.sun.security.auth.module.Krb5LoginModule required
    useKeyTab=true
    storeKey=true
    keyTab="/etc/security/keytabs/connect.keytab"
    principal="connect/_HOST@REALM";
};
```
Modify the keytab and principal settings to reflect your environment.
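When you run several workers, each host needs its own JAAS file with the host-specific principal filled in for the `_HOST` placeholder. The sketch below is illustrative only (not a connector utility) and renders the `KafkaClient` section from a keytab path, hostname, and realm; the hostname and realm shown are examples:

```python
# Illustrative sketch: render a per-worker KafkaClient JAAS section instead of
# editing each file by hand. Mirrors the example file above.
JAAS_TEMPLATE = '''KafkaClient {{
    com.sun.security.auth.module.Krb5LoginModule required
    useKeyTab=true
    storeKey=true
    keyTab="{keytab}"
    principal="connect/{host}@{realm}";
}};
'''

def render_jaas(keytab, host, realm):
    return JAAS_TEMPLATE.format(keytab=keytab, host=host, realm=realm)

print(render_jaas("/etc/security/keytabs/connect.keytab",
                  "worker01.example.com", "EXAMPLE.COM"))
```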
- Start Kafka Connect.

```
./bin/connect-distributed.sh config/connect-distributed.properties
```
See the Apache Kafka documentation for more information on configuring Kafka Connect using JAAS.
SASL/PLAIN
Do not run SASL/PLAIN in production without SSL.
Configure Kafka Connect when your Kafka cluster is secured using SASL/PLAIN:
- Configure the Kafka Connect worker and consumer settings to use SASL/PLAIN in `config/connect-distributed.properties`.

```
# Worker security settings are located at the top level
security.protocol=SASL_SSL
sasl.mechanism=PLAIN

# Sink security settings are prefixed with "consumer."
consumer.security.protocol=SASL_SSL
consumer.sasl.mechanism=PLAIN
```
- Navigate to `bin/connect-distributed.sh` and edit the `EXTRA_ARGS` environment variable.
- Pass in the location of the JAAS configuration file.

```
EXTRA_ARGS=${EXTRA_ARGS-'-name connectDistributed -Djava.security.auth.login.config=/root/kafka_connect_jaas.conf'}
```
For example, a Kafka client JAAS file for SASL/PLAIN:

```
KafkaClient {
    org.apache.kafka.common.security.plain.PlainLoginModule required
    username="alice"
    password="alice-secret";
};
```
- Start Kafka Connect.

```
./bin/connect-distributed.sh config/connect-distributed.properties
```
See Confluent's documentation for more information on configuring Kafka Connect using SASL/PLAIN.
SASL/SCRAM-SHA-256 and SASL/SCRAM-SHA-512
Configure the Kafka Connect worker and consumer settings to use SASL/SCRAM:
- Navigate to `config/connect-distributed.properties` and make the following adjustments. Worker security settings are located at the top level; sink security settings are prefixed with `consumer.`:

```
security.protocol=SASL_SSL
sasl.mechanism=SCRAM-SHA-256

consumer.security.protocol=SASL_SSL
consumer.sasl.mechanism=SCRAM-SHA-256
```

To use SCRAM-SHA-512 instead, set both `sasl.mechanism` settings to `SCRAM-SHA-512`.
- Modify `bin/connect-distributed.sh` by editing the `EXTRA_ARGS` environment variable.
- Pass in the location of the JAAS configuration file.

```
EXTRA_ARGS=${EXTRA_ARGS-'-name connectDistributed -Djava.security.auth.login.config=/root/kafka_connect_jaas.conf'}
```
For example, a Kafka client JAAS file for SASL/SCRAM:

```
KafkaClient {
    org.apache.kafka.common.security.scram.ScramLoginModule required
    username="alice"
    password="alice-secret";
};
```
- Start Kafka Connect.

```
./bin/connect-distributed.sh config/connect-distributed.properties
```
Workers and SinkTasks now work with your SASL/SCRAM secured cluster. See Confluent's documentation for more information on configuring Kafka Connect using JAAS.
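SCRAM credentials are stored in the Kafka cluster itself, so the user referenced in the JAAS file must be created on the broker side before the connector can authenticate. The sketch below assembles Kafka's own `kafka-configs.sh` invocation as a dry run for review rather than executing it; the ZooKeeper address, user name, and password are placeholders for your environment:

```shell
# Dry run: print the broker-side command for creating SCRAM credentials for the
# example user "alice". bin/kafka-configs.sh ships with Kafka; localhost:2181
# and the password are placeholders -- review, then run the printed command.
SCRAM_CMD='bin/kafka-configs.sh --zookeeper localhost:2181 --alter --add-config "SCRAM-SHA-256=[password=alice-secret],SCRAM-SHA-512=[password=alice-secret]" --entity-type users --entity-name alice'
echo "$SCRAM_CMD"
```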
This documentation applies to the following versions of Splunk® Connect for Kafka: 1.0.0