All DSP releases prior to DSP 1.4.0 use Gravity, a Kubernetes orchestrator that has been announced as end-of-life. We have replaced Gravity with an alternative component in DSP 1.4.0. Therefore, we will no longer provide support for versions of DSP prior to DSP 1.4.0 after July 1, 2023. We advise all of our customers to upgrade to DSP 1.4.0 in order to continue to receive full product support from Splunk.
Connecting Kafka to your DSP pipeline as a data destination
If you have a Universal license for Splunk Data Stream Processor, you can connect to an Apache Kafka or Confluent Kafka broker and use it as a data destination. You can get data into a data pipeline, transform it, and then send the transformed data to a Kafka broker. See Licensing for the Splunk Data Stream Processor in the Install and administer the Data Stream Processor manual for information about licensing.
You can also use Kafka as a data source. See Connecting Kafka to your DSP pipeline as a data source for information about this use case.
DSP supports three types of connections for accessing Kafka brokers:
| Kafka connection type | Description |
|---|---|
| SASL-authenticated | Username and password authentication is used. You can choose to protect your credentials using SCRAM (Salted Challenge Response Authentication Mechanism) or leave them in plaintext. The connection is encrypted using SSL. |
| SSL-authenticated | Two-way SSL authentication is used: DSP and the Kafka broker authenticate each other using the SSL protocol. The connection is also encrypted using SSL. |
| Unauthenticated | No authentication takes place between DSP and the Kafka broker. The connection is not encrypted. |
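DSP collects all of these settings through its connection configuration rather than through a client properties file. If it helps to see what each connection type corresponds to at the Kafka client level, the following is a minimal, illustrative sketch of the equivalent producer or consumer properties. The broker address, credentials, and keystore paths are placeholders, not values from this documentation.

```java
import java.util.Properties;

// Illustrative only: DSP gathers these settings through its connection
// configuration, not through a properties file. All values are placeholders.
public class ConnectionTypeExamples {

    // SASL-authenticated: SCRAM credentials over an SSL-encrypted channel.
    static Properties saslScram() {
        Properties p = new Properties();
        p.put("bootstrap.servers", "broker.example.com:9093");
        p.put("security.protocol", "SASL_SSL");
        p.put("sasl.mechanism", "SCRAM-SHA-256");
        p.put("sasl.jaas.config",
              "org.apache.kafka.common.security.scram.ScramLoginModule required "
              + "username=\"dsp-user\" password=\"changeme\";");
        return p;
    }

    // SSL-authenticated: two-way (mutual) SSL with a client keystore and truststore.
    static Properties mutualSsl() {
        Properties p = new Properties();
        p.put("bootstrap.servers", "broker.example.com:9093");
        p.put("security.protocol", "SSL");
        p.put("ssl.keystore.location", "/path/to/client.keystore.jks");
        p.put("ssl.keystore.password", "changeme");
        p.put("ssl.truststore.location", "/path/to/client.truststore.jks");
        p.put("ssl.truststore.password", "changeme");
        return p;
    }

    // Unauthenticated: plaintext connection, no authentication or encryption.
    static Properties unauthenticated() {
        Properties p = new Properties();
        p.put("bootstrap.servers", "broker.example.com:9092");
        p.put("security.protocol", "PLAINTEXT");
        return p;
    }
}
```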
To connect to Kafka as a data destination, you must complete the following tasks:
1. If the topic that you want to send data to does not already exist in your Kafka broker, create it. For an example of creating a topic programmatically, see the sketch after this list.
   - For information about creating a topic in Apache Kafka, search for "Apache Kafka Quickstart" in the Apache Kafka documentation.
   - For information about creating a topic in Confluent Kafka, search for "Quick Start for Apache Kafka using Confluent Cloud" in the Confluent documentation.
   If you try to send data to a topic that does not already exist, the pipeline fails to send data to Kafka and restarts indefinitely.
2. Create a connection that allows DSP to send data to your Kafka topic.
   - To create a SASL-authenticated connection, see Create a SASL-authenticated DSP connection to Kafka.
   - To create an SSL-authenticated connection, see Create an SSL-authenticated DSP connection to Kafka.
   - To create an unauthenticated connection, see Create an unauthenticated DSP connection to Kafka.
3. Create a pipeline that ends with the Send to Kafka sink function. See the Building a pipeline chapter in the Use the Data Stream Processor manual for instructions on how to build a data pipeline.
4. Configure the Send to Kafka sink function to use your Kafka connection and send data to an existing Kafka topic. See Send data to Kafka in the Function Reference manual.
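For step 1, if you prefer to create the topic programmatically rather than through the Apache Kafka or Confluent quickstart tooling, the following is a minimal sketch using the Kafka AdminClient. The broker address, topic name, partition count, and replication factor are placeholders; adjust them for your environment, and add the security settings that match your connection type if the broker requires authentication.

```java
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.NewTopic;

public class CreateTopic {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // Placeholder broker address; add SASL or SSL settings here if your broker requires them.
        props.put("bootstrap.servers", "broker.example.com:9092");

        try (AdminClient admin = AdminClient.create(props)) {
            // Placeholder topic name with 3 partitions and a replication factor of 1.
            NewTopic topic = new NewTopic("dsp-output", 3, (short) 1);
            admin.createTopics(List.of(topic)).all().get();
            System.out.println("Topic created: " + topic.name());
        }
    }
}
```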
When you activate the pipeline, the sink function starts sending data from the pipeline to the specified Kafka topic.
If your data fails to get into Kafka, check the connection settings to make sure that you specified the correct broker and, if you are using an authenticated connection, the correct credentials and certificates. DSP doesn't validate your credentials when you create the connection.
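To confirm independently of DSP that data is arriving in the topic, and that your broker address and credentials work, you can run a short-lived consumer against the same topic. This sketch reuses the placeholder broker and topic names from the earlier examples; add the SASL or SSL settings that match your connection type.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class VerifyTopic {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Placeholder broker; add the SASL or SSL settings matching your connection type.
        props.put("bootstrap.servers", "broker.example.com:9092");
        props.put("group.id", "dsp-verify");
        props.put("auto.offset.reset", "earliest");
        props.put("key.deserializer",
                  "org.apache.kafka.common.serialization.ByteArrayDeserializer");
        props.put("value.deserializer",
                  "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<byte[], String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("dsp-output"));  // Placeholder topic name.
            ConsumerRecords<byte[], String> records = consumer.poll(Duration.ofSeconds(10));
            for (ConsumerRecord<byte[], String> record : records) {
                System.out.printf("offset=%d value=%s%n", record.offset(), record.value());
            }
        }
    }
}
```

If this consumer can read records with the same credentials that you configured in DSP, the broker side is working and the problem is more likely in the connection or sink function configuration.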