Splunk® Data Stream Processor

Use the Data Stream Processor

On April 3, 2023, the Splunk Data Stream Processor reached its end of sale, and it will reach its end of life on February 28, 2025. If you are an existing DSP customer, contact your account team for more information.

Send data from Splunk DSP to Kafka without authentication

You can use the Apache Kafka Connector (No Authentication) with a Write to Kafka sink function to send data from the Splunk Data Stream Processor (DSP) to an Apache or Confluent Kafka server that does not require SSL/TLS authentication. For information about sending data to SSL-enabled Kafka servers instead, see Send data from Splunk DSP to Kafka using SSL.

You need a Universal license to send data from DSP to a Kafka server. See Licensing for the Splunk Data Stream Processor.

To use a connector, you must create a connection. You can reuse the Kafka connection for both the Read from Kafka source function and the Write to Kafka sink function.

Prerequisites

You must have at least one Kafka server running one of the following Kafka versions:

  • Apache Kafka version 0.10 or higher
  • Confluent Kafka version 3.0 or higher

Configure a Kafka connection without authentication in the Data Stream Processor UI

Create a connection in the Data Stream Processor UI.

If you are editing a connection that's being used by an active pipeline, you must reactivate that pipeline after making your changes.

  1. Navigate to the Data Stream Processor UI. From the Data Management page, select the Connections tab.
  2. Click Create New Connection.
  3. Select Apache Kafka Connector (No Authentication) and then click Next.
  4. Complete the following fields:
    • Connection Name: A unique name for your connection.
    • Description: (Optional) A description of your connection.
    • Brokers: A comma-separated list of your Kafka brokers. You must enter at least one broker. Enter each broker using the format <scheme>://<host>:<port>.
  5. Click Save.
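Each entry in the Brokers field must follow the format <scheme>://<host>:<port>. As a sketch only, the following Python helper (hypothetical; the function name and pattern are assumptions, not part of DSP) shows one way to check a comma-separated broker list before entering it in the UI:

```python
import re

# Pattern for a single broker entry in the form <scheme>://<host>:<port>.
# This regex is an assumption for illustration, not DSP's own validation.
BROKER_PATTERN = re.compile(r"^[a-zA-Z][a-zA-Z0-9+.-]*://[^\s:/]+:\d{1,5}$")

def validate_brokers(brokers: str) -> list[str]:
    """Split a comma-separated broker list and verify each entry's format."""
    entries = [b.strip() for b in brokers.split(",") if b.strip()]
    if not entries:
        raise ValueError("You must enter at least one broker.")
    for entry in entries:
        if not BROKER_PATTERN.match(entry):
            raise ValueError(f"Invalid broker format: {entry!r}")
    return entries

# Example with two placeholder plaintext brokers:
print(validate_brokers("kafka://broker1.example.com:9092, kafka://broker2.example.com:9092"))
```

A list that omits the scheme or the port, such as `broker1.example.com`, would fail this check, which mirrors the format requirement stated for the Brokers field above.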

You can now use your Kafka connection with a Write to Kafka sink function to send data to an Apache or Confluent Kafka server.

Last modified on 31 August, 2020

This documentation applies to the following versions of Splunk® Data Stream Processor: 1.1.0
