Splunk® Data Stream Processor

Connect to Data Sources and Destinations with DSP

DSP 1.2.1 is impacted by the CVE-2021-44228 and CVE-2021-45046 security vulnerabilities from Apache Log4j. To fix these vulnerabilities, you must upgrade to DSP 1.2.4. See Upgrade the Splunk Data Stream Processor to 1.2.4 for upgrade instructions.

On October 30, 2022, all 1.2.x versions of the Splunk Data Stream Processor will reach their end of support date. See the Splunk Software Support Policy for details.
This documentation does not apply to the most recent version of Splunk® Data Stream Processor. For documentation on the most recent version, go to the latest release.

Connecting Kafka to your DSP pipeline as a data source

When creating a data pipeline in Splunk Data Stream Processor (DSP), you can connect to an Apache Kafka or Confluent Kafka broker and use it as a data source. You can get data from Kafka into a pipeline, transform the data as needed, and then send the transformed data out from the pipeline to a destination of your choosing.

If you have a Universal license, you can also use Kafka as a data destination. See Connecting Kafka to your DSP pipeline as a data destination for information about this use case. See Licensing for the Splunk Data Stream Processor for information about licensing.

DSP supports two types of connections for accessing Kafka brokers:

  • SSL-authenticated connections, which are suitable for use in production environments. This type of connection uses two-way SSL authentication, where the client and server authenticate each other using the SSL/TLS protocol. See the sketch after this list for the certificate and key material involved.
  • Unauthenticated connections, which should only be used for testing purposes in a secure internal environment.
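
The settings for an SSL-authenticated connection correspond to standard Kafka two-way SSL (mutual TLS) client configuration: a CA certificate used to verify the broker, plus a client certificate and private key that the broker verifies in return. As a point of reference only, the following Python sketch uses the confluent-kafka client, not DSP, to show that same SSL material in a plain Kafka client configuration. The broker host, port, topic name, and file paths are placeholders.

  # Reference sketch only, not part of DSP: the SSL material a two-way SSL Kafka
  # client needs. The broker host, topic, and file paths below are placeholders.
  from confluent_kafka import Producer

  ssl_conf = {
      "bootstrap.servers": "kafka-broker.example.com:9093",          # broker's SSL listener
      "security.protocol": "SSL",
      "ssl.ca.location": "/etc/kafka/ssl/ca-cert.pem",               # CA cert used to verify the broker
      "ssl.certificate.location": "/etc/kafka/ssl/client-cert.pem",  # client cert that the broker verifies
      "ssl.key.location": "/etc/kafka/ssl/client-key.pem",           # private key for the client cert
  }

  def on_delivery(err, msg):
      # err is set if delivery failed, for example because the TLS handshake was rejected
      if err is not None:
          print("Delivery failed:", err)
      else:
          print("Delivered to topic", msg.topic())

  producer = Producer(ssl_conf)
  producer.produce("test-topic", value=b'{"event": "ssl-check"}', on_delivery=on_delivery)
  producer.flush(10)  # wait up to 10 seconds for the delivery report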

To connect to Kafka as a data source, you must complete the following tasks:

  1. Create a connection that allows DSP to access your Kafka data.
  2. Create a pipeline that starts with the Kafka source function. See the Building a pipeline chapter in the Use the Data Stream Processor manual for instructions on how to build a data pipeline.
  3. Configure the Kafka source function to use your Kafka connection. See Get data from Kafka in the Function Reference manual.
  4. (Optional) Convert the byte-encoded data from Kafka records into strings that are human-readable during data preview and usable in streaming functions that require string input. See Deserialize and preview data from Kafka in DSP.

When you activate the pipeline, the source function starts collecting data from Kafka. The data is received into the pipeline as records that contain byte-encoded data values.
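
To make the byte encoding concrete, the following self-contained Python illustration assumes JSON-encoded payloads, which is a common but by no means universal choice. A Kafka producer typically serializes each event to bytes, and those raw bytes are exactly what the record's value holds when it enters the pipeline; deserializing them, as in optional step 4 above, recovers a string or object that you can preview and pass to downstream functions. The payload itself is a made-up example.

  import json

  # A Kafka record value is just bytes. A producer commonly serializes JSON like this:
  payload = {"event": "login", "user": "jdoe", "status": "success"}
  value_bytes = json.dumps(payload).encode("utf-8")
  print(value_bytes)        # b'{"event": "login", "user": "jdoe", "status": "success"}'
  # This byte string is what the record's value field contains when it enters the pipeline.

  # Deserializing (optional step 4 above) is conceptually the reverse:
  decoded = json.loads(value_bytes.decode("utf-8"))
  print(decoded["user"])    # jdoe -- now human-readable and usable by string-based functions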

If your data fails to get into DSP, check the connection settings to make sure you specified the correct broker, as well as the correct certificates and keys if you're using an SSL-authenticated connection. DSP doesn't validate your credentials when you create the connection.
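
One way to narrow down a failed connection is to confirm, outside of DSP, that the broker address and SSL material work from a plain Kafka client. The following Python sketch uses the confluent-kafka AdminClient to request cluster metadata with the same broker and certificate settings; the host name and file paths are placeholders, and for an unauthenticated broker you would drop the ssl.* keys and connect to the plaintext port instead. If this call fails, fix the broker address or certificates before revisiting your DSP connection.

  # Connectivity check outside of DSP. Host name and file paths are placeholders.
  from confluent_kafka.admin import AdminClient

  admin = AdminClient({
      "bootstrap.servers": "kafka-broker.example.com:9093",
      "security.protocol": "SSL",
      "ssl.ca.location": "/etc/kafka/ssl/ca-cert.pem",
      "ssl.certificate.location": "/etc/kafka/ssl/client-cert.pem",
      "ssl.key.location": "/etc/kafka/ssl/client-key.pem",
  })

  # Raises a KafkaException (for example, a timeout) if the broker is unreachable
  # or the TLS handshake fails.
  metadata = admin.list_topics(timeout=10)
  print("Brokers:", [f"{b.host}:{b.port}" for b in metadata.brokers.values()])
  print("Topics:", sorted(metadata.topics.keys()))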

Last modified on 04 December, 2020
This documentation applies to the following versions of Splunk® Data Stream Processor: 1.2.0, 1.2.1-patch02, 1.2.1, 1.2.2-patch02, 1.2.4, 1.2.5

