Splunk® Data Stream Processor

Getting Data In

On April 3, 2023, Splunk Data Stream Processor will reach its end of sale, and will reach its end of life on February 28, 2025. If you are an existing DSP customer, please reach out to your account team for more information.
This documentation does not apply to the most recent version of Splunk® Data Stream Processor. For documentation on the most recent version, go to the latest release.

Create a connection for the DSP Apache Kafka Connector without authentication

Use the Apache Kafka Connector (No Authentication) to get data from an Apache or Confluent Kafka server that does not require SSL/TLS authentication. For information about getting data from SSL-enabled Kafka servers instead, see Create a connection for the DSP Kafka SSL Connector.

To use a connector, you must create a connection. You can reuse the Kafka connection for both the Read from Kafka source function and the Write to Kafka sink function.
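For reference, the following is a minimal SPL2-style pipeline sketch showing a single Kafka connection reused by both the source and the sink. The connection ID, topic names, and the read_kafka and write_kafka function names shown here are assumptions for illustration only; confirm the exact function names and signatures in the DSP Function Reference for your release.

    | from read_kafka("my-kafka-connection-id", "input-topic")
    | into write_kafka("my-kafka-connection-id", "output-topic")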

Prerequisites

You must have at least one Kafka server running one of the following Kafka versions:

  • Apache Kafka version 0.10 or higher
  • Confluent Kafka version 3.0 or higher

Configure a Kafka connection without authentication in the Data Stream Processor UI

Create a connection in the Data Stream Processor UI.

If you are editing a connection that's being used by an active pipeline, you must reactivate that pipeline after making your changes.

  1. Navigate to the Data Stream Processor UI. From the Data Management page, select the Connections tab.
  2. Click Create New Connection.
  3. Select Apache Kafka Connector (No Authentication) and then click Next.
  4. Complete the following fields:
    Field            Description
    Connection Name  A unique name for your Kafka connection.
    Description      (Optional) A description of your connection.
    Brokers          A comma-separated list of your Kafka brokers. You must enter at least one broker. Enter each broker using the format <scheme>://<host>:<port>. See the example value after these steps.
  5. Click Save.
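As an illustration of the Brokers format, a value for two non-SSL brokers might look like the following. The hostnames, ports, and the PLAINTEXT scheme are hypothetical placeholders rather than values from this documentation; substitute the listener scheme and addresses that your own brokers actually expose.

    PLAINTEXT://kafka-broker-01.example.com:9092,PLAINTEXT://kafka-broker-02.example.com:9092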

You can now use your connection in a data pipeline. See Deserialize and send Kafka data from a DSP pipeline.

Last modified on 31 August, 2020

This documentation applies to the following versions of Splunk® Data Stream Processor: 1.1.0

