Splunk® Data Stream Processor

Getting Data In


Create a connection for the DSP Kafka SSL Connector

Use the DSP Kafka SSL Connector to get data from an Apache Kafka or Confluent Kafka server. The Kafka SSL Connector supports two-way SSL authentication, where the client and server authenticate each other using the SSL/TLS protocol. Before you can use any connector, you must create a connection.

Requirements

  • At least one Kafka server with SSL enabled.
  • A Java keystore and truststore on each Kafka server, with all certificates signed by a Certificate Authority (CA).
  • A client private key, a client certificate, and the CA certificate used to sign the client certificate.
  • Apache Kafka version 0.10 or higher, or Confluent Kafka version 3.0 or higher.

If you haven't yet created a Kafka client keystore for your SSL-enabled Kafka server, follow the steps in this topic to create one and to extract the key and certificates in the format required by the DSP Kafka SSL connection.

If you already have a Kafka client keystore and truststore, the key/certificate pair was generated with the RSA algorithm, and the client certificate was signed by the same CA certificate used to sign the server certificate, skip ahead to the section "Export the key and certificates in PEM format".
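
To check whether an existing keystore meets these requirements, you can list its contents. This is a standard keytool command; the {keystore-password} placeholder stands in for your own keystore password.

    keytool -list -v -keystore client.keystore.jks -storepass {keystore-password}

Look for a PrivateKeyEntry whose key algorithm is RSA, and confirm that the issuer of the client certificate matches your CA certificate.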

Create the client keystore

  1. From a command line interface, use the Java keytool command to create the client keystore.
    keytool -keystore client.keystore.jks -alias localhost -validity 365 -genkey -keyalg RSA
    
  2. When prompted, set a keystore password and enter the identifying metadata. You can use any password and metadata, including leaving the metadata fields blank.

After you enter the required information, the client.keystore.jks file is saved in your working directory.
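
If you're scripting this step, keytool can also run non-interactively. The following variant is a sketch that assumes placeholder values for the password and the distinguished name fields; substitute your own.

    keytool -keystore client.keystore.jks -alias localhost -validity 365 -genkey -keyalg RSA \
        -storepass {keystore-password} -keypass {keystore-password} \
        -dname "CN=localhost, OU=dsp, O=example, L=city, ST=state, C=US"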

Sign the client certificate with a CA certificate

Sign the client certificate with the same CA certificate you used to sign the Kafka server certificate in the server keystore file. In this example, the CA certificate is in the file ca-cert in PEM format and starts with -----BEGIN CERTIFICATE-----. The CA key is in the file ca-key in PEM format and starts with -----BEGIN ENCRYPTED PRIVATE KEY-----.
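
If you don't already have a CA key pair, you can generate a self-signed CA for testing. This is a minimal sketch using standard openssl options; the {ca-password} placeholder matches the one used in the signing command below.

    openssl req -new -x509 -keyout ca-key -out ca-cert -days 365 -passout pass:{ca-password}

Remember that this same CA must also sign the Kafka server certificate, or the two-way SSL handshake will fail.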

  1. Export the client certificate so that it can be signed.
    keytool -keystore client.keystore.jks -alias localhost -certreq -file cert-file
  2. Sign the client certificate with the CA certificate.
    openssl x509 -req -CA ca-cert -CAkey ca-key -in cert-file -out cert-signed -days 365 -CAcreateserial -passin pass:{ca-password}
  3. Add the CA certificate to the keystore.
    keytool -keystore client.keystore.jks -alias CARoot -import -file ca-cert
  4. Add the signed client certificate to the keystore.
    keytool -keystore client.keystore.jks -alias localhost -import -file cert-signed
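
To confirm that the keystore now contains both entries and that the signed certificate chains back to your CA, you can run the following checks. Both are standard commands; only the file names from the steps above are assumed.

    keytool -list -keystore client.keystore.jks -storepass {keystore-password}
    openssl verify -CAfile ca-cert cert-signed

The keytool listing should show two entries, one for the CARoot alias and one for localhost, and openssl verify should report cert-signed: OK.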

Export the key and certificates in PEM format

  1. Convert the JKS keystore to PKCS12 (.p12) format. Use the same password for both keystores.
    keytool -importkeystore -srckeystore client.keystore.jks -destkeystore client.keystore.p12 -srcstoretype jks -deststoretype pkcs12
  2. Export the certificates to the file certs.pem.
    openssl pkcs12 -in client.keystore.p12 -nokeys -out certs.pem -passin pass:{keystore-password}
  3. Export the client private key unencrypted in PEM format, and convert it to RSA format. Save the formatted private key to the file privkey.pem.
    openssl pkcs12 -in client.keystore.p12 -nocerts -nodes -passin pass:{keystore-password} | openssl rsa -out privkey.pem
    
  4. The certs.pem file contains multiple certificates. Copy the certificate labeled friendlyName: CAroot and the one labeled friendlyName: localhost into their own separate files and save the files. You'll need to upload these certificates to configure your connection.
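
As an alternative to copying the certificates out of certs.pem by hand, openssl pkcs12 can export the client and CA certificates directly into separate files. The -clcerts and -cacerts options are standard; the output file names here are only suggestions.

    openssl pkcs12 -in client.keystore.p12 -nokeys -clcerts -out client-cert.pem -passin pass:{keystore-password}
    openssl pkcs12 -in client.keystore.p12 -nokeys -cacerts -out ca-root.pem -passin pass:{keystore-password}

Either way, you end up with one file containing only the client certificate and one containing only the CA certificate, ready to upload.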

Configure the Kafka connection in the Data Stream Processor UI

Now that you have the required certificates and keys, create a connection in the Data Stream Processor UI.

  1. In the Data Stream Processor UI, click Data Management > Connections, then click Create New Connection.
  2. Complete the following fields:
    Field                    Description
    Name                     The connection name.
    Description              (Optional) A description of your connection.
    Kafka brokers            A comma-separated list of your Kafka brokers. You must enter at least one broker.
    Client Private Key       The privkey.pem file, beginning with -----BEGIN RSA PRIVATE KEY----- and ending with -----END RSA PRIVATE KEY-----.
    Client Certificate       The client (localhost) certificate from step 4 of "Export the key and certificates in PEM format", beginning with -----BEGIN CERTIFICATE----- and ending with -----END CERTIFICATE-----.
    CA or Kafka Server Cert  The original ca-cert file. You can also copy the CAroot certificate from the certs.pem file, save it as a separate file, and upload that file.
    Kafka properties         (Optional) Any additional Kafka consumer properties that you want to apply to this connection. To enter more than one property, click Add input for each additional property.

    The Kafka source and sink functions that use this connection automatically set a list of Kafka properties that can't be overwritten. See the "Kafka properties" section of this topic.

    The data that you upload when creating the Kafka SSL connection is transmitted securely over HTTPS. The client private key is encrypted and stored securely in a secrets manager.

  3. Click Save.
  4. (Optional) If you are editing a connection that's being used by an active pipeline, reactivate that pipeline after making your changes.

You can now use your connection in a data pipeline. See Deserialize and send Kafka data from a DSP pipeline.

Kafka properties

When using the Kafka SSL connector, the Read from Apache Kafka with SSL and Write to Kafka with SSL functions automatically set the following Kafka properties. These properties can't be overwritten.

ssl.truststore.location=/local/path/to/kafka.client.truststore.jks
ssl.truststore.password=<randomized-password>
ssl.keystore.location=/local/path/to/kafka.client.keystore.jks
ssl.keystore.password=<randomized-password>
ssl.key.password=<randomized-password>
ssl.keystore.type=JKS
ssl.truststore.type=JKS
security.protocol=SSL
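
Any property not on this list can still be supplied through the optional Kafka properties field. For example, to let the connection fetch larger record batches, you could set standard Kafka consumer properties such as the following. The values shown are illustrative, not recommendations.

    max.partition.fetch.bytes=2097152
    fetch.max.bytes=52428800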