Splunk® Data Stream Processor

DSP Function Reference




Normalize (projection)

Applies scalar functions to fields from each record in a stream of records and returns a new record containing only the provided fields. Use the normalize function to pass through or rename fields that you want to keep from the input record, or to set new fields in your record.

When using this function in the UI, you can view the event schema to be normalized by selecting the normalize function and clicking View Configurations. For example, if you are normalizing data from the Ingest REST API, the View Configurations tab of the normalize function displays the default fields in the event schema. You can normalize fields using field aliases or eval expressions. This function accepts a variable number of arguments.
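
For example, the following is a minimal DSL sketch that combines both approaches, assuming a hypothetical upstream stream named my-events and that a scalar string function such as lower is available in your DSP environment. The first as() expression renames a field using a field alias, and the second sets the output field from an eval expression:

projection(my-events,
    as(get("host"), "hostname"),
    as(lower(get("sourcetype")), "sourcetype")
);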

Function Input
collection<record<R>>
This function takes in collections of records with schema R.
Function Output
collection<record<S>>
This function outputs the same collection of records but with a different schema S.

API name: projection

Arguments

Argument: function
Input: collection<expression<any>>
Description: A variadic list of expressions that become fields in the returned record.
UI example: In the UI, enter a field or an expression such as get(field) in the "Original field" text box, and the name of the output field, such as field_2, in the "Output" text box.

Full DSL example

This example keeps the timestamp, id, and owner fields unchanged, renames the logGroup and logStream fields to log_group and log_stream, and sets the source and sourcetype fields to cloudwatch and vpc-flow-logs, respectively.

projection(parsed-cloudwatch-vpc-events,
    as(get("timestamp"), "timestamp"),
    as(get("id"), "id"),
    as(get("logGroup"), "log_group"),
    as(get("logStream"), "log_stream"),
    as(get("owner"), "owner"),
    as("cloudwatch", "source"),
    as("vpc-flow-logs", "sourcetype")
);
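
As a hedged illustration of the effect of this example (the field values are hypothetical), an input record such as

{"timestamp": 1574380800000, "id": "abc123", "logGroup": "vpc-flow", "logStream": "eni-1a2b3c", "owner": "123456789012", "accountId": "123456789012"}

is returned as

{"timestamp": 1574380800000, "id": "abc123", "log_group": "vpc-flow", "log_stream": "eni-1a2b3c", "owner": "123456789012", "source": "cloudwatch", "sourcetype": "vpc-flow-logs"}

Because projection returns a new record with only the provided fields, any input field that is not listed, such as accountId in this sketch, is dropped from the output.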

This documentation applies to the following versions of Splunk® Data Stream Processor: 1.0.0

