Splunk® Data Stream Processor

Use the Data Stream Processor



On April 3, 2023, Splunk Data Stream Processor will reach its end of sale, and will reach its end of life on February 28, 2025. If you are an existing DSP customer, please reach out to your account team for more information.
This documentation does not apply to the most recent version of Splunk® Data Stream Processor. For documentation on the most recent version, go to the latest release.

Monitor your pipeline with data preview and real-time metrics

Use data preview and metrics to investigate, monitor, and troubleshoot your pipeline in real time. You can view metrics on both active and inactive pipelines. Viewing metrics on an active pipeline shows you the rate of all events going in and out of each function.

You can preview data on a per-function basis to see how your data gets transformed at different stages of your pipeline. Data preview also shows the schema of your records and the data type of each field in your record. When previewing data, you can choose the order in which fields are displayed by dragging them into the desired position, and you can view your records in table, list, or raw views. These options appear once preview data runs through your pipeline.

To view metrics on an inactive pipeline, start a preview session on your pipeline, which runs sample data through it. The preview shows a sample of 100 events, but the preview session continues collecting metrics until you stop it or it times out after 15 minutes. You can preview data on full pipelines or on partial pipelines, as long as your partial pipeline includes a data source. Preview data is not available for data sink functions or the union function, and preview data is never delivered to any specified destination.

The starting point of the sample preview data depends on the initial_position setting of the data source. If initial_position is set to "LATEST", the sample events begin from the latest position of the data stream. If initial_position is set to "TRIM_HORIZON", the sample events begin at the very start of the data stream.
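For example, the initial position is typically set as an argument on the pipeline's source function. The following SPL2 fragment is a hypothetical sketch only: the function names, connection ID, stream name, and argument name are illustrative placeholders, so check the source function reference for your connector's actual signature.

```spl2
| from read_kinesis("my-connection-id", "my-stream", initial_position: "TRIM_HORIZON")
| into write_index("", "main");
```

With "TRIM_HORIZON", preview sampling starts at the beginning of the stream; with "LATEST", it starts at the current tail, so no sample events appear until new data arrives.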

Preview data on an inactive pipeline

  1. From the Data Pipelines Management page, click Inactive pipelines and search for the pipeline that you want to view metrics for.
  2. Click the pipeline that you want to view metrics for, and then click Run Preview to run sample data through your pipeline.
  3. If you are not already sending data to your pipeline, send data to your pipeline.
  4. After a few seconds, each function displays live metrics: events per second in and out, bytes per second in and out, and the latency of processing events in that function.
  5. Metrics are collected every second, and the most recent numbers are shown in the UI.
  6. (Optional) Click each function to see how your data is sent and transformed at each step of your pipeline. This is helpful when troubleshooting where your events might be getting stuck or dropped.

View metrics on an active pipeline

  1. From the Data Pipelines Management page, click Active pipelines and search for the pipeline that you want to view metrics for.
  2. Click the pipeline that you want to view metrics for to open the read-only canvas view.
  3. Each function displays live metrics: events per second in and out, bytes per second in and out, and the latency of processing events in that function.
  4. Metrics are collected every second, and the most recent numbers are shown in the UI.

Preview data types

Preview displays the data types of each field in your records. To learn more about DSP data types, see Data Stream Processor Data Types.

Symbol          Data Type
#               number
a               string
B               bytes
(boolean icon)  boolean
[]              array
{}              map
(.*)            regex
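To see how these symbols map onto a record, consider the following sketch of a record whose field values cover each preview data type. The field names and values are illustrative only, not part of DSP; the comments note the symbol that preview would show for each field.

```python
import re

# Hypothetical record whose field values cover each data type shown in preview.
# Field names and values are illustrative, not part of DSP.
record = {
    "event_count": 42,                     # number  -> shown as "#"
    "source_type": "syslog",               # string  -> shown as "a"
    "body": b"raw payload",                # bytes   -> shown as "B"
    "is_valid": True,                      # boolean -> shown as the boolean icon
    "tags": ["prod", "web"],               # array   -> shown as "[]"
    "attributes": {"host": "web-01"},      # map     -> shown as "{}"
    "extract_pattern": re.compile(r"\d+"), # regex   -> shown as "(.*)"
}
```

In preview, nested types such as the map and array fields can be expanded to inspect the types of their contained values.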
Last modified on 28 September, 2020

This documentation applies to the following versions of Splunk® Data Stream Processor: 1.1.0

