Splunk® Data Stream Processor

Use the Data Stream Processor

On April 3, 2023, Splunk Data Stream Processor will reach its end of sale, and will reach its end of life on February 28, 2025. If you are an existing DSP customer, please reach out to your account team for more information.
This documentation does not apply to the most recent version of Splunk® Data Stream Processor. For documentation on the most recent version, go to the latest release.

Create a Splunk DSP pipeline that processes universal forwarder data

Because the universal forwarder doesn't parse incoming data except in certain cases, you must use the Key_by and Merge Events functions to properly ingest data from the universal forwarder into your data pipeline. The Data Stream Processor provides a Splunk universal forwarder template so that you do not need to construct this pipeline from scratch.

Prerequisites

Steps

  1. From the Build Pipeline page, select the Splunk universal forwarder template.
    This template creates a pipeline that reads data from Splunk Forwarders, does the appropriate processing required by the universal forwarder data source, and sends the data to the main index of the preconfigured Splunk Enterprise instance associated with the Data Stream Processor.
  2. Click Validate and Start Preview to check that your events are passing through your pipeline as expected.
  3. (Optional) Verify that your data is successfully being broken up into events by clicking through each function in the pipeline.
    • Click on the Key By function to verify that your events are being grouped correctly by host, source, source_type, and forwarder_channel_id.
    • Click on the Merge Events function to verify that your events are being delimited correctly. By default, the Merge Events function uses the regular expression /([\r\n]+)/ to break incoming data into one event per line, treating any run of carriage return (\r) or newline (\n) characters as a delimiter.
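To picture what the Key By stage does, here is a minimal Python sketch (not DSP's actual implementation) of grouping records by the (host, source, source_type, forwarder_channel_id) key, so that chunks from the same forwarder channel stay together for merging. The record field names used here are illustrative assumptions.

```python
from collections import defaultdict

def group_records(records):
    """Hypothetical sketch of Key By behavior: records that share the same
    (host, source, source_type, forwarder_channel_id) key land in the same
    group, preserving their arrival order for the Merge Events stage."""
    groups = defaultdict(list)
    for rec in records:
        key = (rec["host"], rec["source"], rec["source_type"],
               rec["forwarder_channel_id"])
        groups[key].append(rec["body"])
    return dict(groups)

records = [
    {"host": "web01", "source": "/var/log/app.log", "source_type": "app",
     "forwarder_channel_id": "1", "body": "chunk-a"},
    {"host": "web01", "source": "/var/log/app.log", "source_type": "app",
     "forwarder_channel_id": "1", "body": "chunk-b"},
    {"host": "db01", "source": "/var/log/db.log", "source_type": "db",
     "forwarder_channel_id": "2", "body": "chunk-c"},
]
print(group_records(records))
```

Records from web01's app.log end up in one group and db01's in another, which is why mixing up the key fields would interleave unrelated data streams.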
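The effect of the default Merge Events delimiter can be sketched in a few lines of Python. This is only an illustration of how the /([\r\n]+)/ pattern breaks a raw forwarder chunk into per-line events, not how DSP implements the function internally.

```python
import re

# Default Merge Events delimiter: any run of carriage-return or newline
# characters marks an event boundary.
DELIMITER = re.compile(r"([\r\n]+)")

def break_into_events(raw):
    """Split a raw chunk of forwarder data into one event per line,
    dropping the delimiter runs and any empty fragments."""
    return [part for part in DELIMITER.split(raw)
            if part and not DELIMITER.fullmatch(part)]

raw_chunk = "ERROR disk full\r\nWARN retrying\nINFO done\n"
print(break_into_events(raw_chunk))
# → ['ERROR disk full', 'WARN retrying', 'INFO done']
```

Note that because the pattern matches one or more characters, a \r\n pair counts as a single boundary rather than producing an empty event between the \r and the \n.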
Last modified on 17 June, 2020

This documentation applies to the following versions of Splunk® Data Stream Processor: 1.1.0
