All DSP releases prior to DSP 1.4.0 use Gravity, a Kubernetes orchestrator for which end-of-life has been announced. In DSP 1.4.0, we have replaced Gravity with an alternative component. Therefore, we will no longer provide support for versions of DSP prior to 1.4.0 after July 1, 2023. We advise all customers to upgrade to DSP 1.4.0 to continue receiving full product support from Splunk.
Create a SASL-authenticated DSP connection to Kafka
To get data from an Apache Kafka or Confluent Kafka broker into a data pipeline in Splunk Data Stream Processor, you must first create a connection. You can then use the connection in the Kafka source function to get data from Kafka into a DSP pipeline. If you have a Universal license, you can also create a connection for the Send to Kafka sink function to send data from DSP to a Kafka topic. See Licensing for the Splunk Data Stream Processor in the Install and administer the Data Stream Processor manual.
To protect your data, create a connection that uses the SASL PLAIN/SCRAM Connector for Kafka. This connector uses username and password authentication, and encrypts all connections using SSL. When using this connector, you can choose to protect your credentials using SCRAM (Salted Challenge Response Authentication Mechanism) if the Kafka broker is configured to support SCRAM.
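For context, the following sketch shows roughly how these options map to standard Kafka client settings. It is illustrative only: DSP manages these settings for you when you use the connector, and the username, password, and mechanism values shown are placeholders.

# Illustrative Kafka client settings for the two authentication options (placeholders, not DSP configuration)
security.protocol=SASL_SSL
# Option 1: PLAIN sends the credentials as plaintext inside the SSL/TLS-encrypted connection
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="dsp-user" password="changeme";
# Option 2: SCRAM exchanges salted, hashed challenge-response values instead of the raw password
# sasl.mechanism=SCRAM-SHA-256
# sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required username="dsp-user" password="changeme";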
For information about other methods for connecting DSP to Kafka brokers, see Create an SSL-authenticated DSP connection to Kafka and Create an unauthenticated DSP connection to Kafka.
Prerequisites
Before you can create a SASL-authenticated Kafka connection, you must have at least one Kafka broker that meets the following requirements:
- Runs one of the following Kafka versions:
  - Apache Kafka version 1.0 or higher
  - Confluent Kafka version 3.0 or higher
- Has SASL authentication enabled
- Has SSL encryption enabled
You must also have a .pem file containing either the server certificate used for SSL encryption or the CA certificate that was used to sign the server certificate. If you don't have this .pem file, ask your Kafka administrator for assistance.
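The following sketch shows a generic Apache Kafka broker configuration (server.properties) that satisfies these prerequisites: a SASL_SSL listener with PLAIN and SCRAM mechanisms enabled, and a keystore used for SSL encryption. It is an illustration only; the listener addresses, file paths, and passwords are placeholders, and your broker setup may differ.

# Example broker settings (server.properties) for a SASL_SSL listener -- placeholders only
listeners=SASL_SSL://0.0.0.0:9093
advertised.listeners=SASL_SSL://kafka.example.com:9093
sasl.enabled.mechanisms=PLAIN,SCRAM-SHA-256,SCRAM-SHA-512
ssl.keystore.location=/etc/kafka/ssl/kafka.server.keystore.jks
ssl.keystore.password=<keystore-password>
ssl.key.password=<key-password>
# The broker certificate in this keystore, or the CA certificate that signed it,
# is what you export as a .pem file and upload when creating the DSP connection.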
Steps
- In DSP, navigate to the Connections page.
- On the Connections page, click Create Connection.
- Depending on whether you're using Kafka as a data source or data destination, do one of the following:
  - On the Source tab, select SASL PLAIN/SCRAM Connector for Kafka Source and then click Next.
  - On the Sink tab, select SASL PLAIN/SCRAM Connector for Kafka Sink and then click Next.
- Complete the following fields:
  - Connection Name: A unique name for your connection.
  - Description: (Optional) A description of your connection.
  - Kafka Brokers: A comma-separated list of your Kafka brokers. You must enter at least one broker. You can enter each broker using the format <scheme>://<host>:<port> or <host>:<port>.
  - SASL Mechanism: The SASL mechanism to use for this connection. Select one of the following options:
    - PLAIN to authenticate the connection using plaintext credentials.
    - SCRAM-SHA-256 to authenticate the connection using credentials that are hashed using the SCRAM-SHA-256 mechanism.
    - SCRAM-SHA-512 to authenticate the connection using credentials that are hashed using the SCRAM-SHA-512 mechanism.
  - Username: Your username for authenticating to the Kafka broker.
  - Password: Your password for authenticating to the Kafka broker.
  - CA or Kafka Server Cert: The .pem file containing either the server certificate used for SSL encryption or the CA certificate that was used to sign the server certificate.
  - Kafka Properties: (Optional) Any additional Kafka consumer properties that you want to apply to this connection. To enter more than one property, click Add input for each property that you want to add. The Kafka source and sink functions that use this connection automatically set a list of Kafka properties that can't be overwritten. See the Kafka properties set by DSP section on this page. For examples of properties you might enter here, see the sketch that follows these fields.
Any credentials that you upload are transmitted securely by HTTPS, encrypted, and securely stored in a secrets manager.
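For example, the optional Kafka Properties field might hold generic Kafka consumer settings such as a client identifier or polling limits. The entries below are placeholders rather than recommendations; consult the Apache Kafka consumer configuration reference for the properties that apply to your deployment, and note that the reserved properties listed in the Kafka properties set by DSP section can't be set here.

# Hypothetical entries for the optional Kafka Properties field (placeholders, not recommendations)
client.id=dsp-kafka-connection
session.timeout.ms=30000
max.poll.records=500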
- Click Save.
If you're editing a connection that's being used by an active pipeline, you must reactivate that pipeline after making your changes. When you reactivate a pipeline, you must select where you want to resume data ingestion. See Using activation checkpoints to activate your pipeline in the Use the Data Stream Processor manual for more information.
You can now use your connection in a Kafka source function at the start of your data pipeline to get data from Kafka, or in a Send to Kafka sink function at the end of your pipeline to send data to Kafka.
- For instructions on how to build a data pipeline, see the Building a pipeline chapter in the Use the Data Stream Processor manual.
- For information about the source function, see Get data from Kafka in the Function Reference manual.
- For information about the sink function, see Send data to Kafka in the Function Reference manual.
- For information about converting the payload of a Kafka record from bytes to a more commonly supported data type such as string, see Deserialize and preview data from Kafka in DSP.
Kafka properties set by DSP
When using the SASL PLAIN/SCRAM Connector for Kafka, the Kafka and Send to Kafka functions automatically set the following Kafka properties:
ssl.truststore.location=/local/path/to/kafka.client.truststore.jks
ssl.truststore.password=<randomized-password>
ssl.truststore.type=JKS
security.protocol=SASL_SSL
sasl.mechanism=<SASL-mechanism-specified-in-connection>
You can't overwrite any of the properties in this list when connecting to Kafka using SASL.
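These settings suggest that DSP converts the uploaded .pem certificate into a JKS truststore with a randomized password and enforces SASL_SSL with the mechanism that you selected for the connection. For comparison, the sketch below shows roughly what an equivalent standalone Kafka client configuration would look like. The sasl.jaas.config line is an assumption about how the connection's username and password are applied, not documented DSP behavior, and all paths and values are placeholders.

# Approximate standalone-client equivalent of a SCRAM-SHA-256 connection (placeholders; sasl.jaas.config is assumed)
security.protocol=SASL_SSL
sasl.mechanism=SCRAM-SHA-256
sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required username="dsp-user" password="changeme";
ssl.truststore.location=/local/path/to/kafka.client.truststore.jks
ssl.truststore.password=<randomized-password>
ssl.truststore.type=JKS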