Splunk® Data Stream Processor

Use the Data Stream Processor

On April 3, 2023, Splunk Data Stream Processor will reach its end of sale, and will reach its end of life on February 28, 2025. If you are an existing DSP customer, please reach out to your account team for more information.
This documentation does not apply to the most recent version of Splunk® Data Stream Processor. For documentation on the most recent version, go to the latest release.

Create a pipeline using a template

You can save any pipeline as a template so that it can be reused.

Use an existing template to create a pipeline

The Splunk Data Stream Processor ships with eight templates.

HTTP API
  A pipeline that ingests data from the Ingest REST API Service and sends the data to the Splunk Enterprise instance configured for the Data Stream Processor. See Format and send events using the Ingest REST API.

Splunk Firehose to Splunk Index
  A pipeline that ingests data from the Splunk Firehose and sends the data to the Splunk Enterprise instance configured for the Data Stream Processor.

Splunk Universal Forwarder
  A pipeline that ingests data from a Splunk universal forwarder and sends the data to the Splunk Enterprise instance configured for the Data Stream Processor. Events from universal forwarders require additional processing in your pipeline, most of which this template handles. You still need to provide a delimiter for your log files. See Send data using a Universal Forwarder.

Amazon CloudWatch Metrics
  A pipeline that ingests data from CloudWatch Metrics and sends the data to the Splunk Enterprise instance configured for the Data Stream Processor. To use this template, you must first create a CloudWatch Metrics connection.

Amazon Kinesis Data Streams
  A pipeline that ingests data from a Kinesis stream and sends the data to the Splunk Enterprise instance configured for the Data Stream Processor. To use this template, you must first create an Amazon Kinesis connection.

AWS S3
  A pipeline that ingests data from Amazon S3 and sends the data to the Splunk Enterprise instance configured for the Data Stream Processor. To use this template, you must first create an Amazon S3 connection.

Azure Event Hubs Source
  A pipeline that ingests data from an Azure Event Hubs namespace and sends the data to the Splunk Enterprise instance configured for the Data Stream Processor. To use this template, you must first create an Azure Event Hubs connection.

Azure Monitor Metrics
  A pipeline that ingests data from Microsoft Azure Monitor and sends the data to the Splunk Enterprise instance configured for the Data Stream Processor. To use this template, you must first create an Azure Monitor Metrics connection.
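The HTTP API template expects events in the format used by the Ingest REST API. As a minimal sketch of what such an event batch looks like (the field names and the "main" index value here are assumptions based on typical Ingest REST API payloads; confirm them against the Format and send events using the Ingest REST API topic):

```python
import json
import time

def build_ingest_events(messages, source="my-app", sourcetype="my_sourcetype"):
    """Build a list of events in the general shape accepted by the
    DSP Ingest REST API. Field names are assumptions; verify them
    against the Ingest REST API documentation for your DSP version."""
    now_ms = int(time.time() * 1000)
    return [
        {
            "body": msg,                   # the raw event payload
            "timestamp": now_ms,           # epoch time in milliseconds
            "source": source,
            "sourcetype": sourcetype,
            "attributes": {"index": "main"},  # assumed target index
        }
        for msg in messages
    ]

# Serialize a two-event batch for an HTTP POST to the ingest endpoint.
payload = json.dumps(build_ingest_events(["login failed", "login ok"]))
```

You would then POST this payload, with a bearer token, to your DSP deployment's ingest events endpoint; the exact host, port, and path depend on your installation.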

Create a template from scratch

Follow these steps to create a template from scratch.

  1. From the Data Management page in the Splunk DSP UI, click Templates.
  2. Click Create New Template.
  3. Construct your template. Templates can be full or partial pipelines and can have all or some function arguments filled out.
  4. Click Save, then give your template a name and a description to save it for reuse.

After saving, other admins in your tenant can use your template to build their own pipelines.
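Because a template can be a full or partial pipeline, a simple full-pipeline template might consist of just a source and a sink. The following SPL2 sketch is illustrative only: function names and arguments vary by DSP version, so treat it as an example of the shape of a template, not exact syntax.

```spl2
// Illustrative full-pipeline template: read events from the
// Splunk Firehose and write them to the "main" index.
// Function names are assumptions; check the SPL2 function
// reference for your DSP version.
| from read_splunk_firehose()
| into write_index("", "main");
```

A partial-pipeline template could instead leave the sink arguments blank so that each admin fills in their own destination when building a pipeline from it.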

Create a template from a pre-existing pipeline

Follow these steps to save an existing pipeline as a template.

  1. From the Data Management page in the Splunk DSP UI, find the pipeline that you want to save as a template.
  2. Click the More Options menu, and select Save As.
  3. Give your template a name and a description, and select Template from the Save As drop-down list.

After saving, you are taken to the Edit View of the new template. Any further changes that you make apply to the template rather than to the original pipeline.

Last modified on 17 June, 2020

This documentation applies to the following versions of Splunk® Data Stream Processor: 1.1.0
