Splunk® Data Stream Processor

Use the Data Stream Processor


DSP 1.2.0 is impacted by the CVE-2021-44228 and CVE-2021-45046 security vulnerabilities from Apache Log4j. To fix these vulnerabilities, you must upgrade to DSP 1.2.4. See Upgrade the Splunk Data Stream Processor to 1.2.4 for upgrade instructions.

On October 30, 2022, all 1.2.x versions of the Splunk Data Stream Processor will reach their end of support date. See the Splunk Software Support Policy for details.
This documentation does not apply to the most recent version of Splunk® Data Stream Processor. For documentation on the most recent version, go to the latest release.

SPL2 in the Splunk Data Stream Processor

Users build and configure pipelines in the Splunk Data Stream Processor (DSP) using the Search Processing Language (SPL2). For details about the SPL2 language, see the SPL2 Search Manual. A DSP pipeline corresponds to an SPL2 program. Users can either write the SPL2 program directly in the SPL2 Pipeline Builder or construct pipelines incrementally in the Canvas Pipeline Builder. DSP may refer to various SPL2 language constructs by other names. This page outlines equivalent terminology between SPL2 concepts and those used in DSP. In the future, these names will be replaced by their equivalent SPL2 counterparts.

Ways to use SPL2 in DSP

The SPL2 Pipeline Builder uses SPL2 programs to build and configure pipelines. SPL2 programs contain statements, which support variable assignments and must be terminated by a semicolon. See Create a pipeline using the SPL2 Pipeline Builder.
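For example, a single-statement SPL2 program assigned to a variable and terminated by a semicolon might look like the following sketch. The specific source and sink function names here are illustrative; the functions available depend on your DSP environment and connections.

    $pipeline = | from splunk_firehose()
                | eval temp_f = (temp_c * 9 / 5) + 32
                | into index("", "main");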

Pipelines can also be built through a UI using the Canvas Pipeline Builder. With the Canvas Pipeline Builder, you build a pipeline by incrementally dropping functions onto a canvas and typing SPL2 expressions. See Create a pipeline using the Canvas Builder.


There are terminology differences between SPL2 and DSP.


DSP uses the term "function" more broadly: a pipeline is constructed entirely from functions. Certain functions (called "commands" in SPL2) operate on data streams and are referred to as "streaming commands" or "streaming functions". Other functions operate on scalar values, such as string manipulations; these are called "functions" in SPL2, or "scalar functions".

In summary, a "function" in the is either:

  • An SPL2 command, or "streaming function" or "streaming command", which operates on data streams.
  • An SPL2 function, or "scalar function", which operates on scalar data and is used within a streaming function.
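The distinction can be sketched as follows (function names illustrative): here eval is a streaming function that operates on each record in the stream, while upper is a scalar function applied to a single field value within it.

    | from splunk_firehose()
    | eval source_upper = upper(source)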

DSP supports a rich set of functions. Because these functions are used in a stream processing context, some are unique to DSP, and some differ slightly from similar SPL2 search commands and functions. For a comprehensive list of available functions, see DSP functions by category.

Sources and Sinks vs. Datasets

SPL2 reads and writes data from datasets. These are also referred to as source functions or sink functions in DSP. Source functions and sink functions are a special type of streaming function that represent a data source or a data destination, respectively.
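A pipeline therefore typically begins with a source function and ends with a sink function, as in this sketch (function names illustrative; your available sources and sinks depend on your configured connections):

    | from splunk_firehose()
    | where source_type = "syslog"
    | into index("", "main");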

Last modified on 09 March, 2022

This documentation applies to the following versions of Splunk® Data Stream Processor: 1.2.0, 1.2.1-patch02, 1.2.1, 1.2.2-patch02, 1.2.4, 1.2.5
