Splunk® Data Stream Processor

Connect to Data Sources and Destinations with DSP

DSP 1.2.1 is impacted by the CVE-2021-44228 and CVE-2021-45046 security vulnerabilities in Apache Log4j. To fix these vulnerabilities, you must upgrade to DSP 1.2.4. See Upgrade the Splunk Data Stream Processor to 1.2.4 for upgrade instructions.

On October 30, 2022, all 1.2.x versions of the Splunk Data Stream Processor will reach their end-of-support date. See the Splunk Software Support Policy for details.
This documentation does not apply to the most recent version of Splunk® Data Stream Processor. For documentation on the most recent version, go to the latest release.

Process data from a universal forwarder in DSP

The Splunk universal forwarder sends unparsed data, which means that the data arrives in DSP in 64-kilobyte blocks. As a result, events that are too long might get truncated, and multiple events might be grouped together as one event. In addition, when DSP receives data from a universal forwarder, the timestamp assigned to each event is the time of ingestion rather than a timestamp taken from the event itself.
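For example, a universal forwarder might deliver a single block that contains two logical events, as in this hypothetical sample:

    2021-04-24 10:15:02 ERROR Connection to indexer refused
        at com.example.Handler.run(Handler.java:42)
    2021-04-24 10:15:07 INFO Retrying connection in 5 seconds

Without further processing, DSP treats this whole block as one event and stamps it with the ingest time.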

Use the Splunk universal forwarder pipeline template to process the data in the following ways, combined in the SPL2 sketch after this list:

  • Split the incoming stream of data into separate lines based on the location of a timestamp in the event body.
  • Merge the separated lines into events.
  • Extract the timestamp from the event body and use the extracted value as the timestamp of the event itself.
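
The template combines these operations into a single SPL2 pipeline. The following is a hypothetical sketch only: the function names mirror the UI labels described in the steps below, and the exact SPL2 function names, arguments, and destination function can differ between DSP versions, so check the Function Reference before adapting it.

    // Hypothetical SPL2 sketch of the pipeline that this template builds.
    // Confirm exact function names and arguments in the DSP Function Reference.
    | from forwarders("forwarders:all")     // read unparsed data from the Forwarders service
    | apply_line_breaking                   // "Apply Line Break" using the auto setting
    | apply_timestamp_extraction            // "Apply Timestamp Extraction" using built-in rules
    | into splunk_enterprise_indexes(...);  // send to the main index of the preconfigured instance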

Prerequisites

Before you can process universal forwarder data in DSP, you must configure the universal forwarder to send data to DSP. See Create a connection between a Splunk forwarder and the Forwarders service.
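
As a sketch of what that configuration involves, the outputs.conf stanza on the forwarder might look like the following. The host name, port, and certificate path are illustrative assumptions; the Forwarders service requires a client certificate that you have already registered with DSP, and the connection topic linked above describes the exact values for your deployment.

    # Hypothetical outputs.conf on the universal forwarder.
    # Replace the host, port, and certificate path with the values
    # from your own DSP deployment.
    [tcpout]
    defaultGroup = dsp

    [tcpout:dsp]
    server = dsp.example.com:30001
    clientCert = /opt/splunkforwarder/etc/auth/mycerts/my_forwarder.pem
    sslPassword = <certificate password>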

Steps

  1. From the Data Stream Processor home page, click Build Pipeline and then select the Splunk universal forwarder template.
    This template creates a pipeline that reads data from Splunk forwarders, performs the processing that universal forwarder data requires, and sends the data to the main index of the preconfigured Splunk Enterprise instance associated with the Data Stream Processor.
  2. To check if your events are passing through your pipeline as expected, do the following:
    1. Click More Options (...) and select Validate.
    2. Click Start Preview.
  3. (Optional) Verify that your data is successfully being broken up into events by clicking through each function in the pipeline.
    • Click the Apply Line Break function to verify that your events are being split and stitched back together properly. The template uses the auto setting for this function, which breaks events based on the location of timestamps in the event body, merges any additional text after the timestamp into the same event, and creates a new event when another timestamp is detected. Timestamps are detected using DSP's built-in timestamp rules.
    • Click the Apply Timestamp Extraction function to verify that timestamps are being extracted from the event body properly. By default, the Apply Timestamp Extraction function uses the built-in timestamp rules from DSP and the datetime.xml file from the Splunk instance to detect and extract timestamps. The extracted timestamps replace the ingest-time timestamps that were assigned when the data entered DSP. For a hypothetical before-and-after sample, see the example after these steps.
  4. Once you've confirmed that your universal forwarder events are being processed in DSP as desired, click More Options (...) and select Save As.
  5. Give your pipeline a name and a description, and select Pipeline from the Save As drop-down list.
  6. (Optional) Continue adding functions to your pipeline to add further transformations to your events.
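
To make the expected result concrete, here is how the hypothetical sample block shown earlier would look after the Apply Line Break and Apply Timestamp Extraction functions. The field names follow the DSP event schema (body and timestamp); the exact rendering in the preview pane may differ.

    Event 1
      timestamp: 2021-04-24 10:15:02
      body:      2021-04-24 10:15:02 ERROR Connection to indexer refused
                     at com.example.Handler.run(Handler.java:42)

    Event 2
      timestamp: 2021-04-24 10:15:07
      body:      2021-04-24 10:15:07 INFO Retrying connection in 5 seconds

The stack trace line stays with the first event because no new timestamp precedes it.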

You now have a pipeline that performs the necessary transformations on universal forwarder events for DSP.

Last modified on 24 April, 2021

This documentation applies to the following versions of Splunk® Data Stream Processor: 1.2.0, 1.2.1-patch02, 1.2.1, 1.2.2-patch02, 1.2.4, 1.2.5

