Splunk® Data Stream Processor

Use the Data Stream Processor

On April 3, 2023, Splunk Data Stream Processor reached its end of sale, and will reach its end of life on February 28, 2025. If you are an existing DSP customer, please reach out to your account team for more information.

All DSP releases prior to DSP 1.4.0 use Gravity, a Kubernetes orchestrator whose end of life has been announced. Gravity has been replaced with an alternative component in DSP 1.4.0. Therefore, Splunk will no longer provide support for versions of DSP prior to DSP 1.4.0 after July 1, 2023. We advise all customers to upgrade to DSP 1.4.0 in order to continue to receive full product support from Splunk.

Adding and updating fields in the Data Stream Processor

The Data Stream Processor allows you to change incoming records in real time, giving you increased flexibility and control over your streaming data. You can add or update fields in your data before sending it to a destination. Follow these guidelines when adding or updating fields. If the fields that you want to work with are part of a larger structure, you might need to extract those field values as a prerequisite. See Working with nested data or Extracting fields in events data.

You can also remove certain fields from your data before sending it to a destination. See Remove unwanted fields from your data.

Adding fields in the Data Stream Processor

To add a new top-level field to your incoming records, use the Eval function. The Eval function adds a new field or updates an existing field in your records. Use the following examples as guidelines for how to add new top-level fields to your data. These examples assume that your pipeline is streaming records that use the event schema.
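
In the Eval function, you enter expressions of the form field_name=expression. As a minimal illustrative sketch (my_constant is a hypothetical field name, not part of the event schema), an expression like the following would add a constant string field to every record:

  my_constant="hello"

The examples below use the same form with scalar functions such as time(), if(), like(), and concat().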

Example 1: Adding a new field containing the time that the record passed through the Eval function

In this example, we'll use Eval to add a new top-level field called mystarttimestamp that contains the timestamp for when the record passed through the Eval function.

  1. From the Pipelines page, select the Splunk DSP Firehose data source.
  2. Create a new top-level field called mystarttimestamp containing the time that the record passed through the function.
    1. Add an Eval function to the pipeline.
    2. Enter the following expression in the function field:
      mystarttimestamp=time()
  3. (Optional) Verify that you now have a top-level field mystarttimestamp containing the time that the record passed through the function.
    1. Click Start Preview and select the Eval function.
    2. Log in to SCloud.
      ./scloud login
    3. Send a sample record to your pipeline to verify that the mystarttimestamp field is being added.
      ./scloud ingest post-events <<< "Hello World"

You now have a new top-level field in your data, mystarttimestamp, containing the time that the record passed through the Eval function.
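
As a rough sketch of what to expect, the preview output for the record might include fields like the following. The timestamp value shown here is illustrative only; the actual value and its format depend on the time() function and on when the record is processed:

  "body": "Hello World",
  "mystarttimestamp": 1647000000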

Example 2: Adding a new field environment indicating which environment the record is from

In this example, we'll use Eval to add a new top-level field called environment that checks the source field to determine what environment this record was sent from.

Assume that records with the following source content are streaming through your pipeline.

  • The source field from record 1:
    "source": "playground"
  • The source field from record 2:
    "source": "prod"
  • The source field from record 3:
    "source": "test"

You can add a top-level field to your records to make it more apparent which environment the incoming records are coming from. In this example, we'll add a new top-level field called environment to the streaming records. This field is populated based on the contents of the source field: if the source field contains playground or production, then the corresponding value is placed in the environment field. If the source field contains any other value, then the value other is placed in the environment field instead.

  1. From the Pipelines page, select the Splunk DSP Firehose data source.
  2. Create a new top-level field called environment that identifies the environment the record came from.
    1. Add an Eval function.
    2. Enter the following expression in the function field:
      environment=if(like(source, "%playground%"), "playground", if(like(source, "%production%"), "production", "other"))
  3. (Optional) Verify that you now have a top-level field environment containing what environment this record was sent from.
    1. Click Start Preview and select the Eval function.
    2. Log in to SCloud.
      ./scloud login
    3. Send a few sample records to your pipeline to verify that your function is working as expected.
      ./scloud ingest post-events --source playground <<< "Hello" 
      ./scloud ingest post-events --source prod <<< "World" 
      ./scloud ingest post-events --source test <<< "Hello, World!" 

You now have a new top-level field in your data, environment, containing the environment that the record was sent from.
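
As a quick sanity check, here is how the three sample records map to environment values under the expression above (a sketch showing only the relevant fields):

  "source": "playground"  ->  "environment": "playground"
  "source": "prod"        ->  "environment": "other"
  "source": "test"        ->  "environment": "other"

Note that a source of prod does not match the %production% pattern, so that record falls through to other. If you want prod to be treated as production, extend the expression with an additional like() condition, for example like(source, "%prod%").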

Updating fields in the Data Stream Processor

There are many different ways that you can manipulate fields in the Data Stream Processor, and the specific functions that you need to include in your pipeline vary depending on the type of data that you are working with. The following examples demonstrate just a few ways that you can update fields in standard, unnested data. If you want to learn how to update fields that contain nested data, see Working with nested data.

To update a field in your data, use the Eval function. The Eval function updates an existing field in your records or adds a new one; if the field named on the left side of the expression already exists, its value is overwritten. Use the following examples as guidelines for how to update fields in your data. These examples assume that your pipeline is streaming records that use the event schema.

Example 1: Updating the body field

In this example, we'll use the Eval function with the concat scalar function to prepend the string price= to the body field.

  1. From the Pipelines page, select the Splunk DSP Firehose data source.
  2. Prepend price= to the existing body field.
    1. Add an Eval function to the pipeline.
    2. Enter the following expression in the function field:
      body=concat("price=", cast(body, "string"))
  3. (Optional) Verify that your body field now has "price=" prepended to it.
    1. Click Start Preview and select the Eval function.
    2. Log in to SCloud.
      ./scloud login
    3. Send a sample record to your pipeline to verify that your function is working as expected.
      ./scloud ingest post-events <<< '10.5'

The contents of the body field are now prepended with the string price=.
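
For instance, if you send the sample record '10.5' as shown above, the updated field in the preview might look like this (illustrative):

  "body": "price=10.5"

The cast(body, "string") call matters here because concat expects string arguments and the body field is not guaranteed to be a string; casting avoids a type mismatch.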

Example 2: Updating the source_type field

In this example, we'll use the Eval function to update the source_type field in your data.

If you do not add a sourcetype to your data and you send your data to Splunk Enterprise, your data is automatically indexed with the default httpevent sourcetype.

Set a source type on your data with the Eval streaming function.

  1. From the Pipelines page, select the Splunk DSP Firehose data source.
  2. Add an Eval function to the pipeline, and enter the following expression in the function field. This sets your source_type field to buttercup_sales.
    source_type="buttercup_sales"

  3. (Optional) Verify that the source_type field was updated.
    1. Click Start Preview and select the Eval function.
    2. Log in to SCloud.
      ./scloud login
    3. Send a sample record to your pipeline to verify that your function is working as expected.
      ./scloud ingest post-events <<< 'Hello World'
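
With this change in place, a record sent to a Splunk Enterprise destination is indexed with the buttercup_sales sourcetype rather than the default httpevent. In the preview output, the updated field might appear as follows (illustrative):

  "source_type": "buttercup_sales"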

Removing fields

For information on how to remove unwanted fields from your data, see Remove unwanted fields from your data.

See also

Functions
Eval
Fields
String manipulation
Related topics
Working with metrics data
Extracting fields in events data
SCloud
Data types
Last modified on 11 March, 2022

This documentation applies to the following versions of Splunk® Data Stream Processor: 1.3.0, 1.3.1, 1.4.0, 1.4.1, 1.4.2, 1.4.3, 1.4.4, 1.4.5, 1.4.6

