Security configurations for Splunk Connect for Kafka
Splunk Connect for Kafka supports the following security processes:
- Secure Sockets Layer (SSL)
- Simple Authentication and Security Layer (SASL)/Generic Security Service Application Program Interface (GSSAPI) (Kerberos)
- SASL/PLAIN
- SASL/Salted Challenge Response Authentication Mechanism (SCRAM)-SHA-256
- SASL/SCRAM-SHA-512
Configure SSL security for Splunk Connect for Kafka
Use the following information to configure SSL security for Splunk Connect for Kafka.
Configure a certificate for the Kafka connector with Splunk
- Create a cert directory:

```
mkdir ~/cert
cd ~/cert
```
- Generate a self-signed certificate:

```
openssl req -newkey rsa:2048 -nodes -keyout kafka_connect.key \
  -x509 -days 365 -out kafka_connect.crt
```
- Generate a certificate .pem file from the .crt file. The Splunk HTTP Event Collector (HEC) requires .pem format.

```
openssl x509 -in kafka_connect.crt -out kafka_connect.pem -outform PEM
```
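If you want to confirm that the conversion succeeded, you can inspect the new .pem file. This is an optional check, not part of the original procedure:

```
# Print the subject and validity dates of the converted certificate
openssl x509 -in kafka_connect.pem -noout -subject -dates
```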
- Generate a new keystore. You will need to create a password when generating the new keystore.

```
keytool -genkeypair -keyalg RSA -keystore keystore.jks
```
- Import the signed/root/intermediate certificate:

```
keytool -importcert -trustcacerts -file kafka_connect.crt \
  -alias localhost -keystore keystore.jks
```
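Optionally, verify that the certificate was imported under the expected alias. keytool prompts for the keystore password you created earlier:

```
# List the entry stored under the "localhost" alias
keytool -list -v -keystore keystore.jks -alias localhost
```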
- Configure HEC using your certificate.
- Copy the certificate and key to Splunk:

```
cp ~/cert/kafka_connect.key ~/splunk/etc/auth
cp ~/cert/kafka_connect.pem ~/splunk/etc/auth
```
- In `$SPLUNK_HOME`, navigate to `etc/apps/splunk_httpinput/local/`.
- Open `inputs.conf` with a text editor and add the following lines under the `[http]` stanza:

```
[http]
disabled = 0
enableSSL = 1
# Use the absolute path to your certificate
serverCert = ~/splunk/etc/auth/kafka_connect.pem
privKeyPath = ~/splunk/etc/auth/kafka_connect.key
# The password for your certificate's private key
sslPassword = <your password>
```
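Optionally, confirm that Splunk picks up the new stanza by checking the merged configuration with btool. This is a quick check, assuming the same install location used above:

```
cd ~/splunk/bin
# Show the effective [http] stanza and which file each setting comes from
./splunk btool inputs list http --debug
```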
- Restart your Splunk platform instance.

```
cd ~/splunk/bin
./splunk restart
```
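Once Splunk restarts, you can send a test event to verify that HEC accepts SSL connections. This is an optional smoke test; it assumes the default HEC port 8088, and `<YOUR_HEC_TOKEN>` is a placeholder for a token you have already created. The `-k` flag skips certificate verification, which is acceptable only when testing a self-signed certificate:

```
# Send a test event to HEC over HTTPS
curl -k https://localhost:8088/services/collector/event \
  -H "Authorization: Splunk <YOUR_HEC_TOKEN>" \
  -d '{"event": "SSL smoke test"}'
```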
Configure a certificate authority for your Kafka broker and Splunk Connect for Kafka
- Generate your own certificate authority (CA) certificate, and add the same CA certificate to each client's and broker's truststore. The following bash script generates the keystores and truststores for brokers (`kafka.server.keystore.jks` and `kafka.server.truststore.jks`) and clients (`kafka.client.keystore.jks` and `kafka.client.truststore.jks`):

```
#!/bin/bash
PASSWORD=test1234
VALIDITY=365
keytool -keystore kafka.server.keystore.jks -alias localhost -validity $VALIDITY -genkey
openssl req -new -x509 -keyout ca-key -out ca-cert -days $VALIDITY
keytool -keystore kafka.server.truststore.jks -alias CARoot -import -file ca-cert
keytool -keystore kafka.client.truststore.jks -alias CARoot -import -file ca-cert
keytool -keystore kafka.server.keystore.jks -alias localhost -certreq -file cert-file
openssl x509 -req -CA ca-cert -CAkey ca-key -in cert-file -out cert-signed -days $VALIDITY -CAcreateserial -passin pass:$PASSWORD
keytool -keystore kafka.server.keystore.jks -alias CARoot -import -file ca-cert
keytool -keystore kafka.server.keystore.jks -alias localhost -import -file cert-signed
keytool -keystore kafka.client.keystore.jks -alias localhost -validity $VALIDITY -genkey
keytool -keystore kafka.client.keystore.jks -alias localhost -certreq -file cert-file
openssl x509 -req -CA ca-cert -CAkey ca-key -in cert-file -out cert-signed -days $VALIDITY -CAcreateserial -passin pass:$PASSWORD
keytool -keystore kafka.client.keystore.jks -alias CARoot -import -file ca-cert
keytool -keystore kafka.client.keystore.jks -alias localhost -import -file cert-signed
```
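Before distributing the stores, you can optionally sanity-check the script's output: confirm that the signed certificate chains to your CA, and that both the CA and the signed certificate are present in the keystore (keytool prompts for the keystore password):

```
# Verify the signed certificate against the CA certificate
openssl verify -CAfile ca-cert cert-signed
# List the entries in the broker keystore (expect CARoot and localhost)
keytool -list -keystore kafka.server.keystore.jks
```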
- Configure Kafka brokers. Navigate to `config/server.properties`, add the following lines, and save your changes:

```
listeners=SSL://localhost:9092
security.inter.broker.protocol=SSL
ssl.enabled.protocols=TLSv1.2,TLSv1.1,TLSv1
ssl.client.auth=none
ssl.keystore.type=JKS
ssl.keystore.location=~/cert/kafka.server.keystore.jks
ssl.keystore.password=test1234
ssl.key.password=test1234
ssl.truststore.type=JKS
ssl.truststore.location=~/cert/kafka.server.truststore.jks
ssl.truststore.password=test1234
```
- Configure Kafka Connect. Navigate to `config/connect-distributed.properties`, add the following lines, and save your changes:

```
bootstrap.servers=localhost:9092
security.protocol=SSL
ssl.key.password=test1234
ssl.keystore.location=~/cert/kafka.client.keystore.jks
ssl.keystore.password=test1234
ssl.truststore.location=~/cert/kafka.client.truststore.jks
ssl.truststore.password=test1234
ssl.enabled.protocols=TLSv1.2,TLSv1.1
ssl.truststore.type=JKS

# Authentication settings for Connect consumers used with sink connectors
consumer.security.protocol=SSL
consumer.ssl.key.password=test1234
consumer.ssl.keystore.location=~/cert/kafka.client.keystore.jks
consumer.ssl.keystore.password=test1234
consumer.ssl.truststore.location=~/cert/kafka.client.truststore.jks
consumer.ssl.truststore.password=test1234
consumer.ssl.enabled.protocols=TLSv1.2,TLSv1.1
consumer.ssl.truststore.type=JKS
```
- Start your Kafka server and Kafka Connect:

```
cd $KAFKA_HOME
bin/zookeeper-server-start.sh config/zookeeper.properties
bin/kafka-server-start.sh config/server.properties
./bin/connect-distributed.sh config/connect-distributed.properties
```
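At this point you can optionally confirm that both processes came up with the expected configuration. The checks below assume the ports used in this example (9092 for the broker, 8083 for the Connect REST API):

```
# The broker should complete a TLS handshake and print its certificate details
openssl s_client -connect localhost:9092 </dev/null | grep -E 'subject=|issuer='

# The Connect worker should list com.splunk.kafka.connect.SplunkSinkConnector among its plugins
curl localhost:8083/connector-plugins
```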
- Create a Kafka topic (see the sketch after these steps for a sample creation command).
- Create a new properties file named `client.properties`. This file is referenced when you use the command line tools to open a connection. Configure it to use your keystore and truststore JKS files:

```
security.protocol=SSL
ssl.keystore.location=~/cert/kafka.client.keystore.jks
# Keystore password
ssl.keystore.password=test1234
# Private key password
ssl.key.password=test1234
ssl.truststore.location=~/cert/kafka.client.truststore.jks
# Truststore password
ssl.truststore.password=test1234
```

- Change the file's access permissions so that the passwords are not readable by other users:

```
chmod 0600 client.properties
```

- Use your `client.properties` file to make connections to a broker from the Kafka command line tools:

```
bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic mytopic --consumer.config client.properties
bin/kafka-console-producer.sh --broker-list localhost:9092 --topic mytopic --producer.config client.properties
```
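If the topic does not already exist, the following sketch creates `mytopic` over the same SSL listener. The partition and replication counts are illustrative placeholders, and `--command-config` reuses the `client.properties` file from above (older Kafka releases may only support `--zookeeper` here instead of `--bootstrap-server`):

```
# Create the topic using the SSL client settings
bin/kafka-topics.sh --create --topic mytopic --partitions 3 --replication-factor 1 \
  --bootstrap-server localhost:9092 --command-config client.properties
```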
- Run the following command to create connector tasks. Use the following table as a reference to adjust the command to fit your deployment.

| Parameter | Description |
| --- | --- |
| `topics` | The Kafka topic(s) to be ingested. |
| `splunk.indexes` | The destination Splunk indexes. |
| `splunk.hec.token` | Your HTTP Event Collector (HEC) token. |
| `splunk.hec.uri` | The URI for your destination Splunk HEC endpoint. |

```
curl localhost:8083/connectors -X POST -H "Content-Type: application/json" -d '{
  "name": "ssl_validate_certs_true",
  "config": {
    "connector.class": "com.splunk.kafka.connect.SplunkSinkConnector",
    "tasks.max": "1",
    "topics": "<YOUR_TOPIC>",
    "splunk.indexes": "<SPLUNK_INDEXES>",
    "splunk.hec.uri": "<SPLUNK_HEC_URI:SPLUNK_HEC_PORT>",
    "splunk.hec.token": "<YOUR_TOKEN>",
    "splunk.hec.ssl.trust.store.path": "<KEYSTORE_LOCATION>",
    "splunk.hec.ssl.trust.store.password": "<PASSWORD_KEYSTORE>",
    "splunk.hec.ack.enabled": "false",
    "splunk.hec.ssl.validate.certs": "true"
  }
}'
```

For example, `<YOUR_TOPIC>` might be `mytopic` and `<KEYSTORE_LOCATION>` might be `~/cert/keystore.jks`. Substitute your own values for all placeholders in angle brackets.
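After the curl call returns, you can optionally confirm that the connector and its task are running through the Connect REST API (the connector name matches the `name` field used above):

```
# Expect "state": "RUNNING" for the connector and its task
curl localhost:8083/connectors/ssl_validate_certs_true/status
```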
Configure workers and SinkTasks to work with your SSL-secured cluster
- Navigate to `$KAFKA_HOME/config/connect-distributed.properties` to configure the Kafka Connect worker and consumer settings to use SSL.
- Adjust the `ssl.truststore.location` and `ssl.truststore.password` settings, and their `consumer.`-prefixed counterparts, to reflect your setup:

```
# Worker security settings are located at the top level
security.protocol=SSL
ssl.truststore.location=/var/private/ssl/kafka.client.truststore.jks
ssl.truststore.password=test1234

# Sink security settings are prefixed with "consumer."
consumer.security.protocol=SSL
consumer.ssl.truststore.location=/var/private/ssl/kafka.client.truststore.jks
consumer.ssl.truststore.password=test1234
```
There is currently no way to change the configuration for connectors individually, but if your server supports client authentication over SSL, use a separate principal for the worker and the connectors. See Confluent's documentation on configuring workers and connectors with security for more information.
- Start Kafka Connect.

```
./bin/connect-distributed.sh config/connect-distributed.properties
```
SASL/GSSAPI (Kerberos)
Configure Kafka Connect when your Kafka cluster is secured using Kerberos.
- Configure the Kafka Connect worker and consumer settings to use Kerberos in `$KAFKA_HOME/config/connect-distributed.properties`:

```
# Worker security settings are located at the top level
security.protocol=SASL_PLAINTEXT
sasl.mechanism=GSSAPI
sasl.kerberos.service.name=kafka

# Sink security settings are prefixed with "consumer."
consumer.sasl.mechanism=GSSAPI
consumer.security.protocol=SASL_PLAINTEXT
```
- Modify `bin/connect-distributed.sh` by editing the `EXTRA_ARGS` environment variable.
- Pass in the location of the JAAS conf file. Optionally, you can specify the path to your Kerberos configuration file and set Kerberos debugging to `true` for troubleshooting connection issues:

```
EXTRA_ARGS=${EXTRA_ARGS-'-name connectDistributed -Djava.security.krb5.conf=/etc/krb5.conf -Djava.security.auth.login.config=/root/kafka_connect_jaas.conf -Dsun.security.krb5.debug=true'}
```
For example, a Kafka Client JAAS file using the principal `connect`:

```
KafkaClient {
    com.sun.security.auth.module.Krb5LoginModule required
    useKeyTab=true
    storeKey=true
    keyTab="/etc/security/keytabs/connect.keytab"
    principal="connect/_HOST@REALM";
};
```

Modify the keytab and principal settings to reflect your environment.
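Before starting Kafka Connect, you can optionally confirm that the keytab and principal are valid by authenticating with them directly. This uses the example paths above; replace `_HOST` and `REALM` with your actual host and Kerberos realm:

```
# Obtain a ticket using the keytab, then list it
kinit -kt /etc/security/keytabs/connect.keytab connect/_HOST@REALM
klist
```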
- Start Kafka Connect.

```
./bin/connect-distributed.sh config/connect-distributed.properties
```
See Confluent's documentation for more information on configuring Kafka Connect using JAAS.
SASL/PLAIN
Do not run SASL/PLAIN in production without SSL.
Configure Kafka Connect worker and consumer settings to use SASL/PLAIN:
- Configure the Kafka Connect worker and consumer settings to use SASL/PLAIN in `$KAFKA_HOME/config/connect-distributed.properties`:

```
# Worker security settings are located at the top level
security.protocol=SASL_SSL
sasl.mechanism=PLAIN

# Sink security settings are prefixed with "consumer."
consumer.security.protocol=SASL_SSL
consumer.sasl.mechanism=PLAIN
```
- Modify `bin/connect-distributed.sh` by editing the `EXTRA_ARGS` environment variable.
- Pass in the location of the JAAS conf file:

```
EXTRA_ARGS=${EXTRA_ARGS-'-name connectDistributed -Djava.security.auth.login.config=/root/kafka_connect_jaas.conf'}
```
For example, a Kafka Client JAAS file for SASL/PLAIN:

```
KafkaClient {
    org.apache.kafka.common.security.plain.PlainLoginModule required
    username="alice"
    password="alice-secret";
};
```
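As an alternative to an external JAAS file, Kafka 0.10.2 and later also accept the login module inline through the `sasl.jaas.config` property. A minimal sketch using the same example credentials, placed in `connect-distributed.properties`:

```
# Worker-level SASL credentials, inline instead of a JAAS file
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="alice" password="alice-secret";
# Same credentials for the Connect consumers used by sink connectors
consumer.sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="alice" password="alice-secret";
```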
- Start Kafka Connect.

```
./bin/connect-distributed.sh config/connect-distributed.properties
```
See Confluent's documentation for more information on configuring Kafka Connect using SASL/PLAIN.
SASL/SCRAM-SHA-256 and SASL/SCRAM-SHA-512
Configure the Kafka Connect worker and consumer settings to use SASL/SCRAM:
- Navigate to `$KAFKA_HOME/config/connect-distributed.properties` and make the following adjustments:

```
# Worker security settings are located at the top level
security.protocol=SASL_SSL
# Use SCRAM-SHA-512 instead if your cluster uses that mechanism
sasl.mechanism=SCRAM-SHA-256

# Sink security settings are prefixed with "consumer."
consumer.security.protocol=SASL_SSL
consumer.sasl.mechanism=SCRAM-SHA-256
```
- Modify `bin/connect-distributed.sh` by editing the `EXTRA_ARGS` environment variable. Pass in the location of the JAAS configuration file:

```
EXTRA_ARGS=${EXTRA_ARGS-'-name connectDistributed -Djava.security.auth.login.config=/root/kafka_connect_jaas.conf'}
```
For example, a Kafka Client JAAS file for SASL/SCRAM:

```
KafkaClient {
    org.apache.kafka.common.security.scram.ScramLoginModule required
    username="alice"
    password="alice-secret";
};
```
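The SCRAM credentials themselves must exist in the Kafka cluster. If the example user has not been created yet, the following broker-side sketch registers it with `kafka-configs.sh` (assuming ZooKeeper at `localhost:2181`, as in the quick-start setup used earlier):

```
# Create SCRAM credentials for user "alice" for both mechanisms
bin/kafka-configs.sh --zookeeper localhost:2181 --alter \
  --add-config 'SCRAM-SHA-256=[password=alice-secret],SCRAM-SHA-512=[password=alice-secret]' \
  --entity-type users --entity-name alice
```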
- Start Kafka Connect.

```
./bin/connect-distributed.sh config/connect-distributed.properties
```
Workers and SinkTasks now work with your SASL/SCRAM-secured cluster. See Confluent's documentation for more information on configuring Kafka Connect using JAAS.