Splunk® Data Stream Processor

Connect to Data Sources and Destinations with DSP

DSP 1.2.0 is affected by CVE-2021-44228 and CVE-2021-45046, security vulnerabilities in Apache Log4j. To fix these vulnerabilities, you must upgrade to DSP 1.2.4. See Upgrade the Splunk Data Stream Processor to 1.2.4 for upgrade instructions.

On October 30, 2022, all 1.2.x versions of the Splunk Data Stream Processor will reach their end of support date. See the Splunk Software Support Policy for details.
This documentation does not apply to the most recent version of Splunk® Data Stream Processor. For documentation on the most recent version, go to the latest release.

Deserialize and preview data from Kafka in DSP

When you create a data pipeline in Splunk Data Stream Processor (DSP) to ingest data from an Apache Kafka or Confluent Kafka topic using the Kafka source function, the ingested data is encoded as bytes. To view the data as human-readable strings during data preview, you must deserialize the data.
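The underlying issue can be seen in plain Python (not SPL2): a payload read from a Kafka topic arrives as raw bytes, and only after decoding does it become readable text. The sample payload below is hypothetical and stands in for a record's value field.

```python
# A record value as the Kafka source function delivers it: raw bytes.
raw_value = b'{"event": "user_login", "status": "success"}'

# Before deserialization, a preview shows a bytes object, not readable text.
print(type(raw_value).__name__)  # bytes

# Deserializing (here, UTF-8 decoding) yields a human-readable string,
# which is what to_string() accomplishes in a DSP pipeline.
decoded = raw_value.decode("utf-8")
print(decoded)  # {"event": "user_login", "status": "success"}
```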

Prerequisites

To ingest data from Kafka into a DSP pipeline, you must have a connection to a Kafka broker. See Create an SSL-authenticated DSP connection to Kafka or Create an unauthenticated DSP connection to Kafka.

Steps

  1. From the Data Stream Processor home page, click Build Pipeline and then select Kafka as your source function.
  2. Configure the Kafka function to use your Kafka connection and get data from your Kafka topic. See Get data from Kafka.
  3. On the pipeline canvas, click the + icon next to the Kafka function and then select Eval from the function picker.
  4. On the View Configurations tab, enter the following SPL2 expression in the function field:
    value=to_string(value)
    
  5. Click Start Preview and click the Eval function on the pipeline canvas to confirm that the data in the value field has been deserialized from bytes into strings.
  6. (Optional) Click Stop Preview and continue building your pipeline by adding new functions to it.
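The Eval step in the walkthrough above can be sketched in plain Python (not SPL2), modeling pipeline records as dictionaries with a bytes-valued value field. The sample records and field names are hypothetical.

```python
# Simulated records as the Kafka source function would emit them:
# each record's "value" field holds raw bytes.
records = [
    {"key": b"host1", "value": b"event one"},
    {"key": b"host2", "value": b"event two"},
]

def eval_to_string(record):
    """Mimic the Eval step value=to_string(value): decode bytes to UTF-8."""
    record = dict(record)
    record["value"] = record["value"].decode("utf-8")
    return record

# "Preview" the deserialized stream: each value is now a readable string.
for rec in map(eval_to_string, records):
    print(rec["value"])
```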
Last modified on 04 December, 2020

This documentation applies to the following versions of Splunk® Data Stream Processor: 1.2.0, 1.2.1-patch02, 1.2.1, 1.2.2-patch02, 1.2.4, 1.2.5

