Splunk® Data Stream Processor

Use the Data Stream Processor

On April 3, 2023, Splunk Data Stream Processor will reach its end of sale, and will reach its end of life on February 28, 2025. If you are an existing DSP customer, please reach out to your account team for more information.
This documentation does not apply to the most recent version of Splunk® Data Stream Processor. For documentation on the most recent version, go to the latest release.

Create a pass-through pipeline to send events

To start sending all your event data to a Splunk index in the configured Splunk Enterprise instance, create a pipeline using the Splunk Firehose to Splunk Index template:


  1. From the Build Pipeline page, click the Splunk Firehose to Splunk Index template.
  2. Click Validate to validate your pipeline. This template sends the events or metrics flowing through your pipeline to the main index of the default, preconfigured Splunk Enterprise instance associated with the Data Stream Processor.
  3. Click Preview to start a preview session on your pipeline, allowing sample data to run through your data pipeline.
  4. Give your pipeline a name and a description: click the More options menu, and then select Update Pipeline Metadata.
  5. After naming your pipeline, click Update.
  6. Click Save to save your pipeline.
  7. Click Activate to activate your pipeline.
  8. Click the Data Management tab to return to the Pipelines Management page.
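The steps above can also be thought of as a create → validate → activate lifecycle. The sketch below is a minimal illustration of that lifecycle as request payloads; the endpoint paths, payload fields, and the `firehose-to-index` name are assumptions for illustration, not a documented DSP API contract, so check them against the Streams REST API reference for your deployment.

```python
import json

# Assumed base path for illustration only; verify against your DSP release.
STREAMS_BASE = "/streams/v2beta1"


def create_pipeline_request(name, description,
                            template="Splunk Firehose to Splunk Index"):
    """Steps 1 and 4: create a named pipeline from the template (hypothetical payload)."""
    body = {"name": name, "description": description, "template": template}
    return ("POST", f"{STREAMS_BASE}/pipelines", body)


def validate_request(pipeline_id):
    """Step 2: ask the service to validate the pipeline."""
    return ("POST", f"{STREAMS_BASE}/pipelines/{pipeline_id}/validate", None)


def activate_request(pipeline_id):
    """Step 7: activate the pipeline so it starts sending events to the main index."""
    return ("POST", f"{STREAMS_BASE}/pipelines/{pipeline_id}/activate", None)


if __name__ == "__main__":
    # Hypothetical pipeline name/description, for illustration.
    method, path, body = create_pipeline_request(
        "firehose-to-index", "pass-through pipeline")
    print(method, path, json.dumps(body))
```

The functions only assemble the requests; in practice you would send them with your tenant's authentication and read the pipeline ID back from the create response before validating and activating.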
Last modified on 18 November, 2019

This documentation applies to the following versions of Splunk® Data Stream Processor: 1.1.0

