Splunk® Data Stream Processor

Connect to Data Sources and Destinations with DSP


Connecting Apache Pulsar to your DSP pipeline as a data source

When creating a data pipeline in Splunk Data Stream Processor (DSP), you can connect to an Apache Pulsar cluster and use it as a data source. You can get data from Pulsar into a pipeline, transform the data as needed, and then send the transformed data out from the pipeline to a destination of your choosing.

To connect to Pulsar as a data source, you must complete the following tasks:

  1. Create a connection that allows DSP to access your Pulsar data. See Create a DSP connection to Apache Pulsar.
  2. Create a pipeline that starts with the Apache Pulsar source function. See the Building a pipeline chapter in the Use the Data Stream Processor manual for instructions on how to build a data pipeline.
  3. Configure the Apache Pulsar source function to use your Pulsar connection. See Get data from Apache Pulsar in the Function Reference manual.
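
Once the tasks above are complete, the resulting pipeline definition might look like the following SPL2 sketch. The connection ID, topic name, and destination function here are placeholder values, and the exact name and arguments of the Pulsar source function are documented in Get data from Apache Pulsar in the Function Reference; this sketch assumes a two-argument form taking a connection ID and a topic.

```
| from pulsar("my-pulsar-connection-id", "persistent://public/default/my-topic")
| into splunk_enterprise_indexes("my-splunk-connection-id", "main", "main");
```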

When you activate the pipeline, the source function starts collecting data from Pulsar.

If your data fails to get into DSP, check the connection settings and confirm that the service URL, SSL certificates, and client private key for your Pulsar cluster are correct. DSP doesn't validate these credentials when you create the connection, so an invalid setting only surfaces as a failure at runtime.
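Because DSP doesn't validate the connection settings for you, it can help to sanity-check the service URL format before digging deeper. The following Python sketch is a hypothetical helper, not part of DSP or the Pulsar client; it only checks that a URL uses Pulsar's `pulsar://` or `pulsar+ssl://` scheme and names a host, which catches a common mistake of pasting the cluster's HTTP admin URL instead of the broker service URL.

```python
from urllib.parse import urlparse

def looks_like_pulsar_service_url(url: str) -> bool:
    """Rough format check for a Pulsar service URL.

    Accepts the pulsar:// (plaintext, conventionally port 6650) and
    pulsar+ssl:// (TLS, conventionally port 6651) schemes with a host.
    This only validates the shape of the URL; it does not prove that
    the cluster is reachable or that the credentials are valid.
    """
    parsed = urlparse(url)
    if parsed.scheme not in ("pulsar", "pulsar+ssl"):
        return False
    return bool(parsed.hostname)

# A TLS broker URL passes the check; an HTTP admin URL does not.
print(looks_like_pulsar_service_url("pulsar+ssl://broker.example.com:6651"))
print(looks_like_pulsar_service_url("http://broker.example.com:8080"))
```

If the URL shape is correct, the next things to verify are that the SSL certificate files are the ones issued for this cluster and that the client private key matches the client certificate.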

Last modified on 04 December, 2020

This documentation applies to the following versions of Splunk® Data Stream Processor: 1.2.0, 1.2.1
