Splunk® Data Stream Processor

Connect to Data Sources and Destinations with DSP

On April 3, 2023, Splunk Data Stream Processor reached its end of sale, and it will reach its end of life on February 28, 2025. If you are an existing DSP customer, please reach out to your account team for more information.

All DSP releases prior to DSP 1.4.0 use Gravity, a Kubernetes orchestrator, which has been announced as end-of-life. We have replaced Gravity with an alternative component in DSP 1.4.0. Therefore, after July 1, 2023, we no longer provide support for versions of DSP prior to DSP 1.4.0. We advise all of our customers to upgrade to DSP 1.4.0 to continue receiving full product support from Splunk.

Create an SSL-authenticated DSP connection to Kafka

To get data from an Apache Kafka or Confluent Kafka broker into a data pipeline in Splunk Data Stream Processor, you must first create a connection. You can then use the connection in the Kafka source function to get data from Kafka into a DSP pipeline. If you have a Universal license, you can also create a connection for the Send to Kafka sink function to send data from DSP to a Kafka topic. See Licensing for the Splunk Data Stream Processor in the Install and administer the Data Stream Processor manual.

To protect your data, create a connection that uses the SSL Connector for Kafka. This connector uses two-way SSL authentication where DSP and the Kafka brokers authenticate each other using the SSL protocol, and also encrypts all connections using SSL.

For information about other methods for connecting DSP to Kafka brokers, see Create a SASL-authenticated DSP connection to Kafka and Create an unauthenticated DSP connection to Kafka.

Prerequisites

Before you can create an SSL-authenticated Kafka connection, you must have the following:

  • At least one Kafka broker that has SSL enabled and is running one of the following Kafka versions:
    • Apache Kafka version 1.0 or higher
    • Confluent Kafka version 3.0 or higher
  • A Java keystore and truststore on each Kafka broker with all certificates signed by a Certificate Authority (CA).
  • A client private key, a client certificate, and the CA certificate used to sign the client certificate.

Ensure that all client and CA certificates are valid; otherwise, the connection will fail SAN validation.
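
As a quick sanity check, you can confirm a certificate's subject and validity dates from the command line. This example assumes the CA certificate is in a PEM file named ca-cert, matching the file name used in the signing steps later on this page:

    openssl x509 -in ca-cert -noout -subject -dates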

If you haven't yet created a Kafka client keystore to interact with your SSL-enabled Kafka broker, follow the instructions on this page to create a Kafka client keystore and then extract the key and certificates in the format required for the Kafka SSL connection.

If you already have a Kafka client keystore where the RSA algorithm was used to generate the key/certificate pair, and the client certificate was signed by the same CA certificate used to sign the server certificate, then skip to the Export the key and certificates in PEM format and Create the SSL-authenticated Kafka connection in DSP sections on this page.
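
If you're not sure which algorithm your existing key pair uses, you can inspect the keystore entry. This optional check assumes the keystore and alias names used elsewhere on this page; look for RSA in the key and signature algorithm fields of the output:

    keytool -list -v -keystore client.keystore.jks -alias localhost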

Create the client keystore

  1. From a command line interface, use the Java keytool command to create the client keystore.
    keytool -keystore client.keystore.jks -alias localhost -validity 365 -genkey -keyalg RSA
    
  2. When prompted, set a password for the keystore and enter the optional metadata. You can use any password and metadata, and you can leave the metadata fields blank.

After you enter the required information, a client.keystore.jks file is saved in your current directory.
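
To verify that the keystore was created correctly, you can optionally list its contents. This check prompts for the password you set in step 2, and the output should show a PrivateKeyEntry for the localhost alias:

    keytool -list -keystore client.keystore.jks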

Sign the client certificate with a CA certificate

Sign the client certificate with the same CA certificate you used to sign the Kafka server certificate in the server keystore file. In this example, the CA certificate is in PEM format in a file named ca-cert, and the certificate starts with -----BEGIN CERTIFICATE-----. The CA key is in PEM format in a file named ca-key, and the key starts with -----BEGIN ENCRYPTED PRIVATE KEY-----.
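
If you don't already have a CA key pair in this format, one common way to generate a self-signed CA for testing matches the file names above. This is a sketch, not a requirement of DSP; when prompted, set a passphrase for the CA key and fill in the certificate metadata. Note that if you use a self-signed certificate, you might need to disable SAN validation (see Disable SAN validation on this page):

    openssl req -new -x509 -keyout ca-key -out ca-cert -days 365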

  1. Export the client certificate so that it can be signed.
    keytool -keystore client.keystore.jks -alias localhost -certreq -file cert-file
  2. Sign the client certificate with the CA certificate. When running this command, replace <ca-password> with the passphrase for the CA key.
    openssl x509 -req -CA ca-cert -CAkey ca-key -in cert-file -out cert-signed -days 365 -CAcreateserial -passin pass:<ca-password>
  3. Add the CA certificate to the keystore.
    keytool -keystore client.keystore.jks -alias CARoot -import -file ca-cert
  4. Add the signed client certificate to the keystore.
    keytool -keystore client.keystore.jks -alias localhost -import -file cert-signed

Export the key and certificates in PEM format

Export the signed client certificate, the CA certificate that was used to sign it, and the client private key from your keystore. You need to upload these exported files to DSP when creating a Kafka SSL connection.

  1. If your keystore is not already in PKCS12 (.p12) format, then convert it to that format. Make sure to use the same password for both the original keystore and the converted keystore. If you created your keystore based on the instructions in this topic, then you need to convert your keystore from JKS (.jks) format to PKCS12 (.p12) format using the following command.
    keytool -importkeystore -srckeystore client.keystore.jks -destkeystore client.keystore.p12 -srcstoretype jks -deststoretype pkcs12
  2. Export the client private key unencrypted in PEM format, and convert it to RSA format. Save the formatted private key to the file privkey.pem. When running the following command, replace <keystore-password> with the password for your keystore.
    openssl pkcs12 -in client.keystore.p12 -nocerts -nodes -passin pass:<keystore-password> | openssl rsa -out privkey.pem
    
  3. Export the certificates to the file certs.pem. When running the following command, replace <keystore-password> with the password for your keystore.
    openssl pkcs12 -in client.keystore.p12 -nokeys -out certs.pem -passin pass:<keystore-password>
  4. Copy the signed client certificate and the CA certificate into separate files:
    1. Open the certs.pem file in a text editor, and then copy the signed client certificate into a new text file and save it. If you created your keystore and certificates based on the instructions in this topic, the signed client certificate is labeled as friendlyName: localhost.
    2. From the certs.pem file, copy the CA certificate into a new text file and save it. If you created your keystore and certificates based on the instructions in this topic, the CA certificate is labeled as friendlyName: CAroot.
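
Before uploading the exported files to DSP, you can optionally verify them. In this sketch, client-cert.pem and ca-cert.pem are hypothetical names for the two files you saved in step 4:

    # Confirm that the private key is a valid RSA key.
    openssl rsa -in privkey.pem -check -noout

    # Confirm the subject and issuer of each certificate.
    openssl x509 -in client-cert.pem -noout -subject -issuer
    openssl x509 -in ca-cert.pem -noout -subject -issuer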

Create the SSL-authenticated Kafka connection in DSP

Now that you have the required certificates and keys, create the Kafka connection in DSP.

  1. In DSP, select the Connections page.
  2. On the Connections page, click Create Connection.
  3. Depending on whether you're using Kafka as a data source or data destination, do one of the following:
    • On the Source tab, select SSL Connector for Kafka Source and then click Next.
    • On the Sink tab, select SSL Connector for Kafka Sink and then click Next.
  4. Complete the following fields:
    • Connection Name: A unique name for your connection.
    • Description: (Optional) A description of your connection.
    • Kafka Brokers: A comma-separated list of your Kafka brokers. You must enter at least one broker. You can enter each broker using the format <scheme>://<host>:<port> or <host>:<port>. See the example following these steps.
    • Client Private Key: The file containing the client private key, beginning with -----BEGIN RSA PRIVATE KEY----- and ending with -----END RSA PRIVATE KEY-----. This is the privkey.pem file created during step 2 of Export the key and certificates in PEM format.
    • Client Certificate: The file containing the client certificate, beginning with -----BEGIN CERTIFICATE----- and ending with -----END CERTIFICATE-----. This is the file that you created from the localhost certificate during step 4a of Export the key and certificates in PEM format.
    • CA or Kafka Server Cert: The file containing the original CA certificate. You can also use the file that you created from the CAroot certificate during step 4b of Export the key and certificates in PEM format. A self-signed certificate will not pass SAN validation; to disable SAN checks, see Disable SAN validation.
    • Kafka properties: (Optional) Any additional Kafka consumer properties that you want to apply to this connection. To enter more than one property, click Add input for each new property. The Kafka source and sink functions that use this connection automatically set a list of Kafka properties that can't be overwritten. See the Kafka properties set by DSP section on this page.

    Any credentials that you upload are transmitted securely by HTTPS, encrypted, and securely stored in a secrets manager.

  5. Click Save.

    If you're editing a connection that's being used by an active pipeline, you must reactivate that pipeline after making your changes.
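
As an example of the Kafka Brokers field format, a hypothetical list of two brokers on a common SSL listener port might look like the following. The host names and port are placeholders, not values that DSP requires:

    kafka-broker-1.example.com:9093,kafka-broker-2.example.com:9093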

You can now use your connection in a Kafka source function at the start of your data pipeline to get data from Kafka, or in a Send to Kafka sink function at the end of your pipeline to send data to Kafka.

  • For instructions on how to build a data pipeline, see the Building a pipeline chapter in the Use the Data Stream Processor manual.
  • For information about the source function, see Get data from Kafka in the Function Reference manual.
  • For information about the sink function, see Send data to Kafka in the Function Reference manual.
  • For information about converting the payload of a Kafka record from bytes to a more commonly supported data type such as string, see Deserialize and preview data from Kafka in DSP.

Kafka properties set by DSP

When using the SSL Connector for Kafka, the Kafka and Send to Kafka functions automatically set the following Kafka properties:

ssl.truststore.location=/local/path/to/kafka.client.truststore.jks
ssl.truststore.password=<randomized-password>
ssl.keystore.location=/local/path/to/kafka.client.keystore.jks
ssl.keystore.password=<randomized-password>
ssl.key.password=<randomized-password>
ssl.keystore.type=JKS
ssl.truststore.type=JKS
security.protocol=SSL

You can't overwrite any of the properties in this list when connecting to Kafka using SSL.
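
Kafka properties outside of this list can still be supplied through the Kafka properties field in the connection settings. For example, a consumer setting such as the following is a hypothetical illustration of a property you could add, not a value that DSP requires:

    auto.offset.reset=earliest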

Disable SAN validation

If you are using a self-signed certificate, you must disable SAN validation to run your DSP pipeline using the SSL Connector for Kafka.

  1. In the Kafka server's server.properties file, add the ssl.endpoint.identification.algorithm= property with an empty value, as shown after these steps.
  2. In the DSP UI, add ssl.endpoint.identification.algorithm as a Kafka property in your Kafka connection. Leave the value field blank.
  3. After making both of these changes, restart your Kafka server before activating your pipeline.
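
For reference, the relevant line in server.properties has nothing after the equals sign:

    ssl.endpoint.identification.algorithm=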