Splunk® Data Stream Processor

Use the Data Stream Processor



On April 3, 2023, Splunk Data Stream Processor will reach its end of sale, and will reach its end of life on February 28, 2025. If you are an existing DSP customer, please reach out to your account team for more information.

All DSP releases prior to DSP 1.4.0 use Gravity, a Kubernetes orchestrator for which end-of-life has been announced. DSP 1.4.0 replaces Gravity with an alternative component. Therefore, after July 1, 2023, we will no longer provide support for versions of DSP prior to DSP 1.4.0. We advise all customers to upgrade to DSP 1.4.0 before July 1, 2023 to continue receiving full product support from Splunk.

Create a pipeline using a template

You can save any pipeline as a template and reuse it to create new pipelines.

Use an existing template to create a pipeline

The Splunk Data Stream Processor ships with several templates, including the following:

HTTP API
    A pipeline that ingests data from the Ingest REST API Service and sends the data to the configured Splunk Enterprise instance. See About the Ingest REST API.

Splunk Firehose to Splunk Index
    A pipeline that ingests data from the Splunk Firehose and sends the data to the configured Splunk Enterprise instance.

Splunk Universal Forwarder
    A pipeline that ingests data from a Splunk universal forwarder and sends the data to the configured Splunk Enterprise instance. Events from universal forwarders require additional processing in your pipeline, most of which this template handles. See Process data from a universal forwarder.

Amazon Kinesis Data Streams
    A pipeline that ingests data from a Kinesis stream and sends the data to the configured Splunk Enterprise instance. To use this template, you must first create an Amazon Kinesis connection.

Azure Event Hubs Source
    A pipeline that ingests data from an Azure Event Hubs namespace and sends the data to the configured Splunk Enterprise instance. To use this template, you must first create an Azure Event Hubs connection.
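Under the hood, a template is an ordinary DSP pipeline expressed in SPL2: a source function, optional processing functions, and a sink function. As a rough sketch only (the function names below are illustrative placeholders, not the exact DSP function names; consult the DSP Function Reference for your version), a "Splunk Firehose to Splunk Index" template might resemble:

```
// Illustrative SPL2 sketch of a source-to-sink template pipeline.
// "splunk_firehose" and "index" stand in for the real DSP source
// and sink function names, which vary by DSP version.
| from splunk_firehose()
| into index("", "main");
```

When you create a pipeline from such a template, you typically only fill in the remaining arguments, such as the target index or a connection ID, rather than building the pipeline from scratch.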

Create a template from scratch

Follow these steps to create a template from scratch.

  1. From the Splunk Data Stream Processor home page, select Templates.
  2. Click Create New Template.
  3. Construct your template. Templates can be full or partial pipelines and can have all or some function arguments filled out.
  4. Click Save, then give your template a name and a description to save it for reuse.

After saving, other admins in your tenant can use your template to build their own pipelines.

Create a template from a pre-existing pipeline

Follow these steps to save an existing pipeline as a template.

  1. From the Pipelines page in the UI, find the pipeline that you want to save as a template.
  2. Click the More Options menu, and select Save As.
  3. Give your template a name and a description, and select Template from the Save As drop-down list.

After saving, you are taken to the edit view of the new template. Any further changes that you make apply to the template rather than to the original pipeline.

Last modified on 02 November, 2022

This documentation applies to the following versions of Splunk® Data Stream Processor: 1.4.0, 1.4.1, 1.4.2, 1.4.3

