Splunk® Data Stream Processor

Connect to Data Sources and Destinations with DSP

DSP 1.2.1 is affected by the Apache Log4j security vulnerabilities CVE-2021-44228 and CVE-2021-45046. To fix these vulnerabilities, you must upgrade to DSP 1.2.4. See Upgrade the Splunk Data Stream Processor to 1.2.4 for upgrade instructions.

On October 30, 2022, all 1.2.x versions of the Splunk Data Stream Processor will reach their end of support date. See the Splunk Software Support Policy for details.
This documentation does not apply to the most recent version of Splunk® Data Stream Processor. For documentation on the most recent version, go to the latest release.

Create an unauthenticated DSP connection to Kafka

To get data from an Apache Kafka or Confluent Kafka broker into a data pipeline in Splunk Data Stream Processor (DSP), you must first create a connection. You can then use the connection in the Kafka source function to get data from Kafka into a DSP pipeline. If you have a Universal license, you can also use the connection in the Send to Kafka sink function to send data from DSP to a Kafka topic. See Licensing for the Splunk Data Stream Processor.

If you're using the data for testing purposes in a secure internal environment, you can choose to create an unauthenticated connection. Otherwise, create an SSL-authenticated connection to protect your data. See Create an SSL-authenticated DSP connection to Kafka for information about creating an SSL-authenticated connection.

Prerequisites

Before you can create an unauthenticated Kafka connection, you must have at least one Kafka server running one of the following Kafka versions:

  • Apache Kafka version 0.10 or higher
  • Confluent Kafka version 3.0 or higher
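
If you want to confirm that a broker is reachable before you create the connection, one quick check is to connect with a Kafka client and list the cluster's topics. The following is a minimal sketch using the third-party kafka-python package (not part of DSP), with a hypothetical broker address that you would replace with your own:

    from kafka import KafkaAdminClient
    from kafka.errors import NoBrokersAvailable

    # Hypothetical broker address. Replace it with one of the brokers
    # you plan to enter in the DSP connection.
    BOOTSTRAP = "kafka-broker.example.com:9092"

    try:
        # The client connects during construction, so this succeeds only
        # if at least one broker is reachable.
        admin = KafkaAdminClient(bootstrap_servers=BOOTSTRAP)
        print("Broker reachable. Topics:", admin.list_topics())
        admin.close()
    except NoBrokersAvailable:
        print("No broker reachable at " + BOOTSTRAP)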

Steps

  1. From the Data Stream Processor home page, click Data Management and then select the Connections tab.
  2. Click Create New Connection.
  3. Select No-Authentication Connector for Kafka and then click Next.
  4. Complete the following fields:
    • Connection Name: A unique name for your Kafka connection.
    • Description: (Optional) A description of your connection.
    • Brokers: A comma-separated list of your Kafka brokers. You must enter at least one broker. Enter each broker using the format <scheme>://<host>:<port>.
  5. Click Save.

    If you're editing a connection that's being used by an active pipeline, you must reactivate that pipeline after making your changes. When you reactivate a pipeline, you must select where you want to resume data ingestion. See Using activation checkpoints to activate your pipeline in the Use the Data Stream Processor manual for more information.

You can now use your connection in a Kafka source function at the start of your data pipeline to get data from Kafka, or in a Send to Kafka sink function at the end of your pipeline to send data to Kafka.
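
If you created an unauthenticated connection for testing, you can verify the end-to-end flow by producing a few test events to the topic that your Kafka source function reads from, and then previewing the data in your pipeline. Here is a minimal sketch, again using the third-party kafka-python package and assuming a hypothetical broker at kafka-broker.example.com:9092 and a hypothetical topic named dsp-test:

    import json
    from kafka import KafkaProducer

    # Hypothetical broker and topic. Use the broker list from your DSP
    # connection and the topic your Kafka source function subscribes to.
    producer = KafkaProducer(
        bootstrap_servers="kafka-broker.example.com:9092",
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )

    for i in range(5):
        # Each record is sent as raw bytes; the DSP pipeline is
        # responsible for deserializing the payload.
        producer.send("dsp-test", {"event_id": i, "message": "test event"})

    producer.flush()  # block until the brokers acknowledge all records
    producer.close()

The record value arrives in DSP as bytes; see Deserialize and preview data from Kafka in DSP for how to deserialize and preview the payload.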

  • For instructions on how to build a data pipeline, see the Building a pipeline chapter in the Use the Data Stream Processor manual.
  • For information about the source function, see Get data from Kafka in the Function Reference manual.
  • For information about the sink function, see Send data to Kafka in the Function Reference manual.