Splunk® Data Stream Processor

Use the Data Stream Processor

On April 3, 2023, Splunk Data Stream Processor reached its end of sale, and will reach its end of life on February 28, 2025. If you are an existing DSP customer, please reach out to your account team for more information.

All DSP releases prior to DSP 1.4.0 use Gravity, a Kubernetes orchestrator that has reached its announced end of life. We have replaced Gravity with an alternative component in DSP 1.4.0. Therefore, after July 1, 2023, we will no longer provide support for versions of DSP prior to DSP 1.4.0. We advise all of our customers to upgrade to DSP 1.4.0 to continue receiving full product support from Splunk.

Test your pipeline configuration

When building a pipeline in the Splunk Data Stream Processor (DSP), you can verify whether your pipeline is configured correctly by doing either of the following:

  • Checking the Function Input and Function Output panels of a function in the Canvas View.
  • Starting a preview session to view the data streaming through the pipeline.

Preview data is not available for sink functions or the Write thru KV Store function. To see the output of those functions, you must preview the output of the pipeline function located immediately before the sink function or Write thru KV Store function.

See the Monitoring chapter in the Install and administer the Splunk Data Stream Processor manual for information about how to monitor active pipelines using the Splunk App for DSP.

You cannot start a preview session for any pipelines containing 100 or more functions.

Test your pipeline with the Function Input and Function Output panels

You can quickly verify the schema, or the fields and field data types, of the data streaming in and out of a particular function by looking at the Function Input and Function Output panels in the Canvas View of a pipeline. Use these panels to verify any schema changes that you made at a particular function. See data types for more information on data types and what they mean.

You can use the drop-down list in the Function Output panel to keep track of what changed at the selected function. The drop-down list has the following display options; the sketch after this list shows one way these categories can be derived:

  • All fields: Displays all fields and their data types.
  • New fields: Displays fields that were added in the selected function, including fields with new names.
  • Updated fields: Displays fields that have had a type conversion in the selected function.
  • Dropped fields: Displays fields that were removed in the selected function, including renamed fields.
  • Unchanged fields: Displays fields with unchanged names and unchanged data types.
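As an illustration of what these categories mean, the following Python sketch classifies fields by comparing a function's input schema with its output schema. The schemas and the diff_schemas helper are hypothetical and only approximate how the UI classifies fields; they are not part of any DSP API.

```python
# Hypothetical sketch: classify fields the way the Function Output
# drop-down list does, by comparing input and output schemas.
# A schema is modeled as a dict mapping field name -> data type.

def diff_schemas(input_schema, output_schema):
    """Return the five display categories as dicts of field -> type."""
    in_names = set(input_schema)
    out_names = set(output_schema)

    # New fields: present only in the output (includes renamed fields,
    # which appear under their new names).
    new = {f: output_schema[f] for f in out_names - in_names}

    # Dropped fields: present only in the input (includes the old
    # names of renamed fields).
    dropped = {f: input_schema[f] for f in in_names - out_names}

    # Updated fields: same name, but the data type was converted.
    updated = {f: output_schema[f]
               for f in in_names & out_names
               if input_schema[f] != output_schema[f]}

    # Unchanged fields: same name and same data type.
    unchanged = {f: output_schema[f]
                 for f in in_names & out_names
                 if input_schema[f] == output_schema[f]}

    # All fields: everything in the output schema.
    return {"all": dict(output_schema), "new": new, "dropped": dropped,
            "updated": updated, "unchanged": unchanged}

# Example: a function that casts `timestamp` to long and renames
# `msg` to `body`.
input_schema = {"timestamp": "string", "msg": "string", "host": "string"}
output_schema = {"timestamp": "long", "body": "string", "host": "string"}
print(diff_schemas(input_schema, output_schema))
```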

While configuring a function in the Canvas View, it is a best practice to periodically click the Update button in the Function Input and Function Output panels to make sure that your configuration is valid and the schema of your data is as expected.

How preview sessions work

When you start a preview session, DSP starts polling for any data sent to the pipeline. The preview session continues until you stop it, or until 15 minutes have passed. During the preview session, DSP does the following (a simplified model appears after this list):

  • Collects metrics every second, and shows the most recent numbers in the pipeline functions on the canvas. You can use these metrics to monitor the performance of your pipeline.
  • Samples records from the incoming data, and displays them on the Preview Results tab. DSP samples up to 100 records for each function in the pipeline. On the Preview Results tab, you can view details such as the schema of the record and the data type and value of each field in the record.
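The following Python sketch models this behavior: a polling loop that collects metrics once per second, keeps at most 100 sampled records per function, and stops after 15 minutes. It is an illustration of the semantics described above, not DSP's actual implementation; the poll_records and collect_metrics callables are hypothetical stand-ins for DSP internals.

```python
import time
from collections import defaultdict

MAX_SAMPLES_PER_FUNCTION = 100      # preview cap per pipeline function
SESSION_TIMEOUT_SECONDS = 15 * 60   # preview sessions end after 15 minutes
METRICS_INTERVAL_SECONDS = 1        # metrics are collected every second

def run_preview_session(poll_records, collect_metrics):
    """Simplified model of a preview session.

    poll_records() returns a mapping of function_id to newly arrived
    records; collect_metrics() returns the latest per-second metrics.
    Both are hypothetical callables standing in for DSP internals.
    """
    samples = defaultdict(list)   # function_id -> sampled records
    metrics = None
    started = time.monotonic()

    while time.monotonic() - started < SESSION_TIMEOUT_SECONDS:
        # Show the most recent per-second metrics on the canvas.
        metrics = collect_metrics()

        # Sample incoming records for each function, up to the cap.
        for function_id, records in poll_records().items():
            room = MAX_SAMPLES_PER_FUNCTION - len(samples[function_id])
            if room > 0:
                samples[function_id].extend(records[:room])

        time.sleep(METRICS_INTERVAL_SECONDS)

    return samples, metrics
```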

See Navigating the Splunk Data Stream Processor for more information about the DSP UI, including examples of how these metrics and sampled records are displayed during preview sessions.

Some data source functions support the initial_position parameter, which determines the location in the data stream where DSP starts reading data. If initial_position is set to LATEST, then DSP samples events starting from the latest position in the data stream. If initial_position is set to TRIM_HORIZON, then DSP samples events starting from the very beginning of the data stream.
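As a rough illustration of the difference, the following Python sketch models a data stream as a list of records and shows where each setting begins reading. The stream model and the start_offset helper are hypothetical; real source functions track positions in the external system (for example, Kinesis shard iterators).

```python
# Hypothetical model: a stream is a list of records, and a reader
# starts at an offset determined by initial_position.

def start_offset(stream, initial_position):
    if initial_position == "TRIM_HORIZON":
        return 0            # read from the very beginning of the stream
    if initial_position == "LATEST":
        return len(stream)  # skip existing records; read only new ones
    raise ValueError(f"unsupported initial_position: {initial_position}")

stream = ["event-1", "event-2", "event-3"]

# TRIM_HORIZON replays everything already in the stream.
print(stream[start_offset(stream, "TRIM_HORIZON"):])
# ['event-1', 'event-2', 'event-3']

# LATEST sees nothing until new records arrive after the session starts.
offset = start_offset(stream, "LATEST")
stream.append("event-4")
print(stream[offset:])
# ['event-4']
```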

Preview the data streaming through a pipeline

Follow these steps to start a preview session.

  1. Open the pipeline that you want to preview.
  2. If the pipeline is active, click Edit to enter the pipeline editing view.
  3. On the pipeline canvas, select a function other than the sink function, and then click the Start Preview button.
    DSP automatically navigates to the Preview Results tab, which displays the message Polling for preview data once the preview session has started.
  4. If data is not already being sent to your pipeline, send some data now. If you are using the Data Stream Firehose or Ingest Service source functions, you can follow the instructions in Use the Ingest service to send test events to your pipeline, or see the minimal sketch after these steps.
    After a few seconds, the functions on the canvas start to display metrics indicating the flow of data through the pipeline, and the Preview Results tab starts displaying the data that you sent.
  5. (Optional) Click each function and check the Preview Results tab to see how your data gets transformed as a result of each function in your pipeline. On the Preview Results tab, you can do any of the following:
    • Choose to view your records in Table format, List format, or as Raw text. In the Table view, you can do the following:
      • Click and drag the field headings to change how they are ordered in the preview. Changing the order of the fields in the preview does not have any impact on the actual records.
      • Check the symbols beside each field name to confirm the data type of the field. See the tables in data types for information about the data type indicated by each symbol.
    • Choose the number of records to display per page.
    • Enter a filter value so that the Preview Results tab only displays records that contain that value.
  6. (Optional) Click Stop Preview.
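
For step 4, the following Python sketch sends a few test events to the DSP Ingest service with the requests library. The hostname, port, endpoint version (v1beta2), and event fields shown here are assumptions based on a typical DSP 1.x deployment; see Use the Ingest service to send test events to your pipeline for the exact endpoint and authentication steps in your version.

```python
import requests

# Assumptions: a hypothetical DSP host and the v1beta2 events endpoint;
# substitute the values for your deployment. The bearer token comes from
# your normal DSP authentication flow (for example, via the scloud CLI).
DSP_HOST = "dsp.example.com"
TOKEN = "<your-access-token>"

url = f"https://{DSP_HOST}:31000/default/ingest/v1beta2/events"
headers = {
    "Authorization": f"Bearer {TOKEN}",
    "Content-Type": "application/json",
}

# Each event carries a body plus optional metadata fields.
events = [
    {"body": "test event 1", "source": "preview-test", "sourcetype": "test"},
    {"body": "test event 2", "source": "preview-test", "sourcetype": "test"},
]

# verify=False only if your deployment uses self-signed certificates.
response = requests.post(url, headers=headers, json=events, verify=False)
response.raise_for_status()
print(response.status_code, response.text)
```

After the request succeeds, the functions on the canvas should begin showing metrics within a few seconds, and the sampled events should appear on the Preview Results tab.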


Last modified on 11 March, 2022

This documentation applies to the following versions of Splunk® Data Stream Processor: 1.3.0, 1.3.1, 1.4.0, 1.4.1, 1.4.2, 1.4.3

