Splunk® Data Stream Processor

Use the Data Stream Processor



On April 3, 2023, Splunk Data Stream Processor will reach its end of sale, and will reach its end of life on February 28, 2025. If you are an existing DSP customer, please reach out to your account team for more information.

Filtering data in DSP

You can filter out unnecessary records in your data pipeline in the Data Stream Processor (DSP). The filter function passes records that match the filter condition to downstream functions in your pipeline. You can also use the filter function to route your data to different downstream branches in your pipeline.

You can also branch your pipeline and apply a filter to each branch so that one set of data flows down one branch and a different set flows down the other. For example, in a branched pipeline, the filter function on one branch can pass only the records where the source field is syslog, while the filter function on the other branch passes only the records where the source field isn't syslog.
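
For example, the two branches described above could use predicates like the following. This is an illustrative sketch that reuses the eq, get, and not functions covered in the next section; the source field name and the syslog value are assumptions taken from the example above.

Branch 1 (records where the source field is syslog):
eq(get("source"), "syslog");

Branch 2 (records where the source field isn't syslog):
not(eq(get("source"), "syslog"));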

Filter conditions

The following table lists common filter conditions that you can use.

Predicate: eq(get("kind"), "metric");
Description: If the kind field is equal to metric, the record passes downstream. If not, the record doesn't get passed through the pipeline.

Predicate: like(get("source_type"), "cisco%");
Description: If the source_type field begins with the string "cisco", the record passes downstream. If not, the record doesn't get passed through the pipeline.

Predicate: gte(get("ttms"), 5000);
Description: Here, ttms is a custom top-level field that contains latency information in milliseconds. If the record has a ttms value of 5000 (5 seconds) or more, the record passes downstream. If not, the record doesn't get passed through the pipeline.

Predicate: not(eq(get("timestamp"), null));
Description: If the timestamp field is not null, the record passes downstream. If timestamp is null, the record doesn't get passed through the pipeline.
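Because not() accepts any condition that returns true or false, you can also negate the other predicates in the table. For example, the following sketch passes only records whose source_type doesn't begin with "cisco". This example isn't part of the table above; it simply combines the like and not functions already shown.

not(like(get("source_type"), "cisco%"));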

Use a filter function

Add a Filter function to allow only the records that match a specified condition to pass to other downstream functions.

  1. From the Data Pipelines Canvas view, click the + icon and add the Filter function to your pipeline.
  2. In the Filter function, specify the filter condition. For examples of filter conditions, see the filter function reference or the Filter conditions section in this topic; a sample condition also follows these steps.
  3. With your Filter function highlighted, click Start Preview to verify that the expression is working as expected.
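
For example, using the custom ttms latency field from the Filter conditions table (an illustrative field; your field names and thresholds will differ), you could enter the following condition in step 2. After you click Start Preview, only records with a ttms value of 5000 or more should appear in the preview results.

gte(get("ttms"), 5000);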
Last modified on 06 December, 2019

This documentation applies to the following versions of Splunk® Data Stream Processor: 1.0.0

