All DSP releases prior to DSP 1.4.0 use Gravity, a Kubernetes orchestrator that has been announced end-of-life. DSP 1.4.0 replaces Gravity with an alternative component. Therefore, after July 1, 2023, we will no longer provide support for versions of DSP prior to 1.4.0. We advise all of our customers to upgrade to DSP 1.4.0 in order to continue to receive full product support from Splunk.
Create an unauthenticated DSP connection to Kafka
To get data from an Apache Kafka or Confluent Kafka broker into a data pipeline in Splunk Data Stream Processor, you must first create a connection. You can then use the connection in the Kafka source function to get data from Kafka into a DSP pipeline. If you have a Universal license, you can also create a connection for the Send to Kafka sink function to send data from DSP to a Kafka topic. See Licensing for the Splunk Data Stream Processor in the Install and administer the Data Stream Processor manual.
If you're using the data for testing purposes in a secure internal environment, you can choose to create an unauthenticated connection. Otherwise, create an authenticated connection to protect your data. See Create a SASL-authenticated DSP connection to Kafka and Create an SSL-authenticated DSP connection to Kafka for information about creating an authenticated connection.
Prerequisites
Before you can create an unauthenticated Kafka connection, you must have at least one Kafka broker running one of the following Kafka versions:
- Apache Kafka version 1.0 or higher
- Confluent Kafka version 3.0 or higher
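Before creating the connection, you might also want to confirm that at least one broker is reachable from your network. The following sketch is not part of DSP; it uses the third-party kafka-python package, and the broker address is a placeholder.

```python
# Minimal reachability check for a Kafka broker, using the third-party
# kafka-python package. The broker address below is a placeholder.
from kafka import KafkaConsumer
from kafka.errors import NoBrokersAvailable

BROKER = "kafka-broker.example.com:9092"  # replace with one of your brokers

try:
    consumer = KafkaConsumer(bootstrap_servers=BROKER)
    print(f"Reached {BROKER}; visible topics: {sorted(consumer.topics())}")
    consumer.close()
except NoBrokersAvailable:
    print(f"No Kafka broker reachable at {BROKER}")
```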
Steps
- In DSP, select the Connections page.
- On the Connections page, click Create Connection.
- Depending on whether you're using Kafka as a data source or data destination, do one of the following:
- On the Source tab, select No-Authentication Connector for Kafka Source and then click Next.
- On the Sink tab, select No-Authentication Connector for Kafka Sink and then click Next.
- Complete the following fields:
| Field | Description |
| --- | --- |
| Connection Name | A unique name for your Kafka connection. |
| Description | (Optional) A description of your connection. |
| Brokers | A comma-separated list of your Kafka brokers. You must enter at least one broker. You can enter each broker using the format `<scheme>://<host>:<port>` or `<host>:<port>`. See the example following these steps. |
- Click Save.
If you're editing a connection that's being used by an active pipeline, you must reactivate that pipeline after making your changes.
You can now use your connection in a Kafka source function at the start of your data pipeline to get data from Kafka, or in a Send to Kafka sink function at the end of your pipeline to send data to Kafka.
- For instructions on how to build a data pipeline, see the Building a pipeline chapter in the Use the Data Stream Processor manual.
- For information about the source function, see Get data from Kafka in the Function Reference manual.
- For information about the sink function, see Send data to Kafka in the Function Reference manual.
- For information about converting the payload of a Kafka record from bytes to a more commonly supported data type such as string, see Deserialize and preview data from Kafka in DSP.
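As background for that last point: Kafka record payloads arrive as raw bytes, and something must decode them into a usable type such as string. The following sketch shows the same bytes-to-string conversion in miniature, reading directly from Kafka with the third-party kafka-python package rather than through DSP; the topic and broker names are placeholders.

```python
# Reads records directly from Kafka and decodes each payload from bytes to
# str, mirroring the deserialization step described above. Uses the
# third-party kafka-python package; topic and broker names are placeholders.
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "my-topic",                                          # placeholder topic
    bootstrap_servers="kafka-broker.example.com:9092",   # placeholder broker
    value_deserializer=lambda raw: raw.decode("utf-8"),  # bytes -> str
    auto_offset_reset="earliest",
    consumer_timeout_ms=5000,  # stop if no records arrive within 5 seconds
)

for record in consumer:
    print(record.value)  # record.value is already a str
```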
This documentation applies to the following versions of Splunk® Data Stream Processor: 1.3.0, 1.3.1, 1.4.0, 1.4.1, 1.4.2, 1.4.3, 1.4.4, 1.4.5, 1.4.6