Splunk® Data Stream Processor

Use the Data Stream Processor

DSP 1.2.0 is impacted by the CVE-2021-44228 and CVE-2021-45046 security vulnerabilities from Apache Log4j. To fix these vulnerabilities, you must upgrade to DSP 1.2.4. See Upgrade the Splunk Data Stream Processor to 1.2.4 for upgrade instructions.

On October 30, 2022, all 1.2.x versions of the Splunk Data Stream Processor will reach their end-of-support date. See the Splunk Software Support Policy for details.
This documentation does not apply to the most recent version of Splunk® Data Stream Processor. For documentation on the most recent version, go to the latest release.

Create a pipeline using a template

You can save any pipeline as a template for reusability.

Use an existing template to create a pipeline

The Splunk Data Stream Processor ships with the following eight templates:

HTTP API: A pipeline that ingests data from the Ingest REST API Service and sends the data to the configured Splunk Enterprise instance associated with the Data Stream Processor. See About the Ingest REST API.

Splunk Firehose to Splunk Index: A pipeline that ingests data from the Splunk Firehose and sends the data to the configured Splunk Enterprise instance associated with the Data Stream Processor.

Splunk Universal Forwarder: A pipeline that ingests data from a Splunk universal forwarder and sends the data to the configured Splunk Enterprise instance associated with the Data Stream Processor. Events from universal forwarders require additional processing in your pipeline, most of which this template handles. See Process data from a universal forwarder in the Splunk Data Stream Processor.

Amazon CloudWatch Metrics: A pipeline that ingests data from Amazon CloudWatch and sends the data to the configured Splunk Enterprise instance associated with the Data Stream Processor. To use this template, you must first create an Amazon CloudWatch connection.

Amazon Kinesis Data Streams: A pipeline that ingests data from a Kinesis stream and sends the data to the configured Splunk Enterprise instance associated with the Data Stream Processor. To use this template, you must first create an Amazon Kinesis connection.

AWS S3: A pipeline that ingests data from Amazon S3 and sends the data to the configured Splunk Enterprise instance associated with the Data Stream Processor. To use this template, you must first create an Amazon S3 connection.

Azure Event Hubs Source: A pipeline that ingests data from an Azure Event Hubs namespace and sends the data to the configured Splunk Enterprise instance associated with the Data Stream Processor. To use this template, you must first create an Azure Event Hubs connection.

Azure Monitor Metrics: A pipeline that ingests data from Microsoft Azure Monitor and sends the data to the configured Splunk Enterprise instance associated with the Data Stream Processor. To use this template, you must first create a Microsoft Azure Monitor connection.
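The HTTP API template above reads from the Ingest REST API Service, which accepts events as a JSON array. As a rough sketch of what a client might send, the snippet below builds a minimal event payload; the endpoint path, host, and token in the commented-out request are illustrative placeholders, not values taken from this page, so check your tenant's Ingest REST API documentation for the exact URL and authentication details.

```python
import json

def build_ingest_event(body, source, sourcetype):
    """Return one event record in a shape the Ingest service can accept.

    The field names used here (body, source, sourcetype, attributes)
    are an assumption based on typical ingest payloads -- verify them
    against the Ingest REST API reference for your DSP version.
    """
    return {
        "body": body,
        "source": source,
        "sourcetype": sourcetype,
        "attributes": {},
    }

# A batch is a JSON array of event records.
events = [build_ingest_event("user login succeeded", "auth-service", "app:auth")]
payload = json.dumps(events)

# To actually send the batch (hypothetical host, path, and bearer token):
# import requests
# requests.post(
#     "https://<DSP_HOST>/<tenant>/ingest/v1beta2/events",
#     headers={"Authorization": "Bearer <TOKEN>",
#              "Content-Type": "application/json"},
#     data=payload,
# )
print(payload)
```

Once events arrive through the Ingest REST API Service, the HTTP API template's pipeline picks them up without further source configuration.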

Create a template from scratch

Follow these steps to create a template from scratch.

  1. From the Data Management page in the UI, click Templates.
  2. Click Create New Template.
  3. Construct your template. Templates can be full or partial pipelines and can have all or some function arguments filled out.
  4. Click Save, then give your template a name and a description to save it for reuse.

After saving, other admins in your tenant can use your template to build their own pipelines.

Create a template from a pre-existing pipeline

Follow these steps to save an existing pipeline as a template.

  1. From the Data Management page in the UI, find the pipeline that you want to save as a template.
  2. Click the More Options menu, and select Save As.
  3. Give your template a name and a description, and select Template from the Save As drop-down list.

After saving, you are taken to the Edit View of the new template. Any further changes that you make apply to the template rather than to the original pipeline.

Last modified on 09 March, 2022

This documentation applies to the following versions of Splunk® Data Stream Processor: 1.2.0, 1.2.1-patch02, 1.2.1, 1.2.2-patch02, 1.2.4, 1.2.5
