Splunk® Data Stream Processor

Connect to Data Sources and Destinations with DSP

DSP 1.2.1 is impacted by the CVE-2021-44228 and CVE-2021-45046 security vulnerabilities in Apache Log4j. To fix these vulnerabilities, you must upgrade to DSP 1.2.4. See Upgrade the Splunk Data Stream Processor to 1.2.4 for upgrade instructions.

On October 30, 2022, all 1.2.x versions of the Splunk Data Stream Processor will reach their end of support date. See the Splunk Software Support Policy for details.
This documentation does not apply to the most recent version of Splunk® Data Stream Processor. For documentation on the most recent version, go to the latest release.

Connecting Kafka to your DSP pipeline as a data destination

If you have a Universal license for Splunk Data Stream Processor (DSP), you can connect to an Apache Kafka or Confluent Kafka broker and use it as a data destination. You can get data into a data pipeline, transform it, and then send the transformed data to a Kafka broker. See Licensing for the Splunk Data Stream Processor for information about licensing.

You can also use Kafka as a data source. See Connecting Kafka to your DSP pipeline as a data source for information about this use case.

DSP supports two types of connections for accessing Kafka brokers:

  • SSL-authenticated connections, which are suitable for use in production environments. This type of connection uses two-way SSL authentication, where the client and server authenticate each other using the SSL/TLS protocol. For an idea of the client-side settings involved, see the sketch after this list.
  • Unauthenticated connections, which should only be used for testing purposes in a secure internal environment.
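For context, the following is a minimal sketch of the client-side properties involved in a two-way SSL connection to a Kafka broker, using the standard Kafka Java client configuration keys. The broker address, file paths, and passwords are placeholders, and DSP collects the equivalent certificates and keys through its own connection settings rather than through a properties file.

    import java.util.Properties;

    public class KafkaSslConfig {
        // Returns client properties for a two-way SSL connection.
        // The host, file paths, and passwords below are placeholders;
        // substitute the values issued for your environment.
        public static Properties sslProperties() {
            Properties props = new Properties();
            props.put("bootstrap.servers", "kafka-broker.example.com:9093");
            props.put("security.protocol", "SSL");
            // Truststore: used by the client to verify the broker's certificate.
            props.put("ssl.truststore.location", "/etc/kafka/client.truststore.jks");
            props.put("ssl.truststore.password", "truststore-password");
            // Keystore: holds the client certificate that the broker verifies.
            props.put("ssl.keystore.location", "/etc/kafka/client.keystore.jks");
            props.put("ssl.keystore.password", "keystore-password");
            props.put("ssl.key.password", "key-password");
            return props;
        }
    }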

To connect to Kafka as a data destination, you must complete the following tasks:

  1. If the topic that you want to send data to does not already exist in your Kafka broker, create it.
    • For information about creating a topic in Apache Kafka, search for "Apache Kafka Quickstart" in the Apache Kafka documentation.
    • For information about creating a topic in Confluent Kafka, search for "Quick Start for Apache Kafka" in the Confluent documentation.

    If you try to send data to a topic that does not already exist, the pipeline fails to send data to Kafka and restarts indefinitely. If you want to create the topic programmatically rather than with the Kafka command-line tools, see the sketch after this list.

  2. Create a connection that allows DSP to send data to your Kafka topic.
  3. Create a pipeline that ends with the Send to Kafka sink function. See the Building a pipeline chapter in the Use the Data Stream Processor manual for instructions on how to build a data pipeline.
  4. Configure the Send to Kafka sink function to use your Kafka connection and send data to an existing Kafka topic. See Send data to Kafka in the Function Reference manual.
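As an alternative to the command-line tools mentioned in step 1, you can create the destination topic with the Kafka Java AdminClient, as in the following sketch. The broker address, the topic name dsp-output, and the partition and replication values are illustrative placeholders; choose values that match your environment, and add the SSL properties shown earlier if the broker requires an authenticated connection.

    import java.util.Collections;
    import java.util.Properties;
    import org.apache.kafka.clients.admin.AdminClient;
    import org.apache.kafka.clients.admin.NewTopic;

    public class CreateTopic {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            // Placeholder broker address; substitute your own.
            props.put("bootstrap.servers", "kafka-broker.example.com:9092");

            try (AdminClient admin = AdminClient.create(props)) {
                // Create the destination topic with 3 partitions and a
                // replication factor of 1 (illustrative values only).
                NewTopic topic = new NewTopic("dsp-output", 3, (short) 1);
                admin.createTopics(Collections.singletonList(topic)).all().get();
                System.out.println("Topic created: " + topic.name());
            }
        }
    }

Run this before activating the pipeline so that the topic already exists when the Send to Kafka sink function first writes to it.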

When you activate the pipeline, the sink function starts sending data from the pipeline to the specified Kafka topic.

If your data fails to reach Kafka, check the connection settings to make sure that you specified the correct broker, as well as the correct certificates and keys if you are using an SSL-authenticated connection. DSP doesn't check whether the credentials you enter are valid, so configuration errors might not surface until the pipeline runs.
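One way to confirm that records are actually reaching the topic is to read from it with a standalone consumer, as in the following sketch. It assumes the placeholder topic name dsp-output and broker address used above; byte-array deserializers are used because the on-the-wire format depends on how your pipeline serializes records.

    import java.time.Duration;
    import java.util.Collections;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;

    public class VerifyTopic {
        public static void main(String[] args) {
            Properties props = new Properties();
            // Placeholder broker address; add the SSL properties shown
            // earlier if the broker requires an authenticated connection.
            props.put("bootstrap.servers", "kafka-broker.example.com:9092");
            props.put("group.id", "dsp-verify");
            props.put("auto.offset.reset", "earliest");
            props.put("key.deserializer",
                    "org.apache.kafka.common.serialization.ByteArrayDeserializer");
            props.put("value.deserializer",
                    "org.apache.kafka.common.serialization.ByteArrayDeserializer");

            try (KafkaConsumer<byte[], byte[]> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(Collections.singletonList("dsp-output"));
                // Poll a few times and report how many records arrive.
                for (int i = 0; i < 5; i++) {
                    ConsumerRecords<byte[], byte[]> records =
                            consumer.poll(Duration.ofSeconds(2));
                    System.out.println("Received " + records.count() + " records");
                }
            }
        }
    }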
