On October 30, 2022, all 1.2.x versions of the Splunk Data Stream Processor will reach their end of support date. See the Splunk Software Support Policy for details.
Connecting Kafka to your DSP pipeline as a data destination
If you have a Universal license for Splunk Data Stream Processor (DSP), you can connect to an Apache Kafka or Confluent Kafka broker and use it as a data destination. You can get data into a data pipeline, transform it, and then send the transformed data to a Kafka broker. See Licensing for the Splunk Data Stream Processor for information about licensing.
You can also use Kafka as a data source. See Connecting Kafka to your DSP pipeline as a data source for information about this use case.
DSP supports two types of connections for accessing Kafka brokers:
- SSL-authenticated connections, which are suitable for use in production environments. This type of connection uses two-way SSL authentication, where the client and server authenticate each other using the SSL/TLS protocol.
- Unauthenticated connections, which should only be used for testing purposes in a secure internal environment.
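As a rough illustration of the difference between the two connection types, the sketch below builds the client settings each one implies, using the Kafka Java client's standard SSL property names. The broker addresses, keystore and truststore paths, and passwords shown are placeholders, not values DSP requires.

```python
# Sketch: Kafka client settings implied by the two DSP connection types.
# Broker addresses, file paths, and passwords are placeholders.

def unauthenticated_config(brokers):
    """Plaintext connection: for testing in a secure internal network only."""
    return {
        "bootstrap.servers": ",".join(brokers),
        "security.protocol": "PLAINTEXT",
    }

def ssl_config(brokers, keystore, keystore_password,
               truststore, truststore_password):
    """Two-way SSL: the client presents its own keystore and verifies
    the broker against a truststore."""
    return {
        "bootstrap.servers": ",".join(brokers),
        "security.protocol": "SSL",
        "ssl.keystore.location": keystore,
        "ssl.keystore.password": keystore_password,
        "ssl.truststore.location": truststore,
        "ssl.truststore.password": truststore_password,
    }
```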
To connect to Kafka as a data destination, complete the following tasks:
1. If the topic that you want to send data to does not already exist in your Kafka broker, create it.
   - For information about creating a topic in Apache Kafka, search for "Apache Kafka Quickstart" in the Apache Kafka documentation.
   - For information about creating a topic in Confluent Kafka, search for "Quick Start for Apache Kafka" in the Confluent documentation.
   If you try to send data to a topic that does not exist, the pipeline fails to send data to Kafka and restarts indefinitely.
2. Create a connection that allows DSP to send data to your Kafka topic.
   - To create an SSL-authenticated connection, see Create an SSL-authenticated DSP connection to Kafka.
   - To create an unauthenticated connection, see Create an unauthenticated DSP connection to Kafka.
3. Create a pipeline that ends with the Send to Kafka sink function. See the Building a pipeline chapter in the Use the Data Stream Processor manual for instructions on how to build a data pipeline.
4. Configure the Send to Kafka sink function to use your Kafka connection and send data to an existing Kafka topic. See Send data to Kafka in the Function Reference manual.
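Because the pipeline restarts indefinitely when the target topic is missing, it can be worth checking for the topic programmatically before activating the pipeline. The sketch below is an illustration, not DSP functionality: it assumes a minimal, hypothetical admin interface with `list_topics()` and `create_topic(...)` methods, which you would back with a thin wrapper around your Kafka admin client of choice.

```python
def ensure_topic(admin, topic, partitions=1, replication_factor=1):
    """Create `topic` on the broker if it does not already exist.

    `admin` is any object exposing list_topics() -> iterable of topic
    names, and create_topic(name, partitions, replication_factor).
    This interface is a placeholder; wrap your real Kafka admin client
    to match it.
    """
    if topic in set(admin.list_topics()):
        return False  # topic already exists; nothing to do
    admin.create_topic(topic, partitions, replication_factor)
    return True
```

Running a check like this before activation avoids the activate/fail/restart loop described in task 1.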
When you activate the pipeline, the sink function starts sending data from the pipeline to the specified Kafka topic.
If your data fails to reach Kafka, check the connection settings to confirm that the broker addresses are correct and, if you are using an SSL-authenticated connection, that the certificates and keys are valid. DSP does not validate these credentials when you create the connection.
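When troubleshooting, a quick reachability check against each broker address can rule out basic network problems before you re-examine certificates and keys. This is a minimal sketch using only the Python standard library; the hostnames and ports you pass in are whatever your connection settings specify.

```python
import socket

def broker_reachable(host, port, timeout=5.0):
    """Return True if a TCP connection to host:port succeeds within timeout.

    This only proves the port is open. It does not validate SSL
    certificates, keys, or Kafka-level authentication.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

If every broker is reachable but data still does not arrive, the problem is more likely in the certificates, keys, or topic name than in the network path.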
This documentation applies to the following versions of Splunk® Data Stream Processor: 1.2.0, 1.2.1-patch02, 1.2.1, 1.2.2-patch02, 1.2.4, 1.2.5