All DSP releases prior to DSP 1.4.0 use Gravity, a Kubernetes orchestrator, which has been announced end-of-life. We have replaced Gravity with an alternative component in DSP 1.4.0. Therefore, we will no longer provide support for versions of DSP prior to DSP 1.4.0 after July 1, 2023. We advise all of our customers to upgrade to DSP 1.4.0 in order to continue to receive full product support from Splunk.
Adding and updating fields in the Data Stream Processor
The Data Stream Processor (DSP) lets you change incoming records in real time, giving you increased flexibility and control over your streaming data. You can add or update fields in your data before sending it to a destination. Follow these guidelines when adding or updating fields. If the fields that you want to work with are part of a larger structure, you might need to extract those field values as a prerequisite. See Working with nested data or Extracting fields in events data.
You can also remove certain fields before sending your data to a destination. See Remove unwanted fields from your data.
Adding fields in the Data Stream Processor
To add a new top-level field to your incoming records, use the Eval function. The Eval function adds a new field to your records or updates an existing one. Use the following examples as a guideline for how to add new top-level fields to your data. These examples assume that your pipeline is streaming records that use the event schema.
Example 1: Adding a new field containing the time that the record passed through the Eval function
In this example, we'll use Eval to add a new top-level field called mystarttimestamp that contains the timestamp for when the record was detected in the Eval function.
- From the Pipelines page, select the Splunk DSP Firehose data source.
- Create a new top-level field called mystarttimestamp containing the time that the record passed through the function.
  - Add an Eval function to the pipeline.
  - Enter the following expression in the function field: mystarttimestamp=time()
- (Optional) Verify that you now have a top-level field mystarttimestamp containing the time that the record passed through the function.
You now have a new top-level field in your data, mystarttimestamp, containing the time that the record passed through the Eval function.
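To make the effect of the mystarttimestamp=time() expression concrete, here is a rough Python sketch (not DSP itself) of what the Eval function does to each record, assuming records are simple maps of field names to values:

```python
import time

def eval_add_start_timestamp(record: dict) -> dict:
    """Emulate the Eval expression mystarttimestamp=time():
    add a top-level field holding the time the record passed through."""
    updated = dict(record)  # copy so the input record is left unchanged
    updated["mystarttimestamp"] = time.time()  # seconds since the epoch
    return updated

record = {"body": "Hello", "source": "playground"}
out = eval_add_start_timestamp(record)
```

The other top-level fields pass through unchanged; only the new field is added.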
Example 2: Adding a new field environment containing which environment the record is from
In this example, we'll use Eval to add a new top-level field called environment that checks the source field to determine what environment this record was sent from.
Assume that records with the following source content are streaming through your pipeline.
- The source field from record 1: "source": "playground"
- The source field from record 2: "source": "prod"
- The source field from record 3: "source": "test"
You can add a top-level field to your schema to make it more apparent which environment the incoming records are coming from. In this example, we'll add a new top-level field called environment to the streaming records. This top-level field is filled out according to the contents of the source field: if the source field contains playground or production, then the corresponding value is placed in the environment field. If the source field contains a different value, then the value other is placed in the environment field instead.
- From the Pipelines page, select the Splunk DSP Firehose data source.
- Create a new top-level field called environment which we'll configure to contain the environment that the record is from.
  - Add an Eval function.
  - Enter the following expression in the function field: environment=if(like(source, "%playground%"), "playground", if(like(source, "%production%"), "production", "other"))
- (Optional) Verify that you now have a top-level field environment containing what environment this record was sent from.
  - Click Start Preview and select the Eval function.
  - Log in to SCloud: ./scloud login
  - Send a few sample records to your pipeline to verify that your function is working as expected:
    ./scloud ingest post-events --source playground <<< "Hello"
    ./scloud ingest post-events --source prod <<< "World"
    ./scloud ingest post-events --source test <<< "Hello, World!"
You now have a new top-level field in your data, environment, containing the environment that the record was sent from.
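The nested if/like expression above can be sketched in Python as follows. This is an emulation for illustration only, not DSP code; like(x, "%s%") is a wildcard match, so a substring check stands in for it here:

```python
def eval_environment(record: dict) -> dict:
    """Emulate:
    environment=if(like(source, "%playground%"), "playground",
                   if(like(source, "%production%"), "production", "other"))
    """
    source = record.get("source", "")
    if "playground" in source:          # like(source, "%playground%")
        env = "playground"
    elif "production" in source:        # like(source, "%production%")
        env = "production"
    else:
        env = "other"                   # fallback value
    return {**record, "environment": env}
```

Note that like(source, "%production%") requires the full substring production, so a source of prod (as in record 2 above) falls through to other under this expression as written.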
Updating fields in the Data Stream Processor
There are many different ways to manipulate fields in the Data Stream Processor, and the specific functions that you need to include in your pipeline vary depending on the type of data that you are working with. The following examples demonstrate just a few ways that you can update fields in standard, unnested data. If you want to learn how to update fields that contain nested data, see Working with metrics.
To update a field in your data, use the Eval function. The Eval function updates an existing field in your records or adds a new one. Use the following examples as a guideline for how to update fields in your data. These examples assume that your pipeline is streaming records that use the event schema.
Example 1: Updating the body field
In this example, we'll use the Eval and concat functions to update the body field by prepending the string price= to it.
- From the Pipelines page, select the Splunk DSP Firehose data source.
- Prepend price= to the existing body field.
  - Add an Eval function.
  - Enter the following expression in the function field: body=concat("price=", cast(body, "string"))
- (Optional) Verify that your body field now has "price=" prepended to it.
The contents of the body field are now prepended with the string price=.
Example 2: Updating the source_type field
In this example, we'll use the Eval function to update the source_type field in your data.
If you do not add a sourcetype to your data and you send your data to Splunk Enterprise, your data is automatically indexed with the default httpevent sourcetype.
Set a source type on your data with the Eval streaming function.
- From the Pipelines page, select the Splunk DSP Firehose data source.
- In the Eval function, type the following. This sets your source_type field to buttercup_sales: source_type="buttercup_sales"
- (Optional) Verify that the source_type field was updated.
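The assignment and the fallback behavior described above can be sketched together in Python. This is an illustration under the stated assumption that an empty or missing source_type leads Splunk Enterprise to index with the default httpevent sourcetype:

```python
def eval_set_source_type(record: dict) -> dict:
    """Emulate the Eval expression source_type="buttercup_sales":
    overwrite (or add) the source_type field with a literal string."""
    return {**record, "source_type": "buttercup_sales"}

def effective_sourcetype(record: dict) -> str:
    """Illustrate the indexing fallback described above: records with no
    sourcetype set are indexed with the default httpevent sourcetype."""
    return record.get("source_type") or "httpevent"
```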
Removing fields
For information on how to remove unwanted fields from your data, see Remove unwanted fields from your data.
See also
- Functions
- Eval
- Fields
- String manipulation
- Related topics
- Working with metrics data
- Extracting fields in events data
- SCloud
- data types
This documentation applies to the following versions of Splunk® Data Stream Processor: 1.3.0, 1.3.1, 1.4.0, 1.4.1, 1.4.2, 1.4.3, 1.4.4, 1.4.5, 1.4.6