Splunk® Data Stream Processor

DSP Function Reference




Core scalar functions

The following are the primary scalar functions that you can use in your pipeline.

as

Returns the value of input, renamed to name.

Function Input
input: T
name: String
Function Output
type: T

DSL example

Adds a new field called event to your data that contains the value of "body".

as(get("body"), "event");

cast

Converts an expression from one data type to another. See casting of data types for casting rules. Returns null if the cast is not supported. If you are trying to cast to a data type that is not supported with the cast function, use a conversion scalar function instead.

Function Input
input: InT
target-type: string
Function Output
type: OutT

DSL example

Converts "1" to type long.

cast(1, "long");

contains_key

Use this function to determine whether a given map contains a given key. Returns true if the key is found, otherwise returns false.

Function Input
input: map<string, T>
key: string
Function Output
type: boolean

DSL example

Returns true because the map contains the key "foo".

contains_key(from_json_object("{\"foo\": [{\"bar\": \"baz\"}]}"), "foo");
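
You might also combine contains_key with if to fall back to a default when a key is missing. A minimal sketch, assuming the standard attributes map and an "index" key that may or may not be present:

// Returns the "index" value from the attributes map if the key exists,
// otherwise returns the literal string "main".
if(contains_key(get("attributes"), "index"),
    map_get(get("attributes"), "index"),
    "main");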

create_map

Creates a new map object at runtime. Returns a map of key-value pairs. This function accepts a variable number of arguments.

Function Input
keys-and-values: collection<expression<any>>
Function Output
type: map<string, T>

DSL example

Returns a key-value map where key "id" is mapped to the value of the field "name" and key "unit" is mapped to value "eps".

create_map(
    literal("id"), get("name"),
    literal("unit"), literal("eps")
);
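
To keep the generated map on your records, you would typically wrap the call in as so that the map lands in a named field. A sketch, assuming a hypothetical output field called "meta":

// Adds a new field called "meta" (hypothetical name) containing the generated map.
as(
    create_map(
        literal("id"), get("name"),
        literal("unit"), literal("eps")
    ),
    "meta"
);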

get

Provides a lookup ("get") of a value in a single record in the current streaming function.

Function Input
fieldName: string
Function Output
type: T

Note that get("foo") only looks for "foo" in the stream the streaming function is called on:

stream1 = ...
stream2 = ...

// this would only find a field named `foo` from fields of stream1, not stream2
stream1_agg = aggregate(stream1, by:(get("foo")), ...) 

DSL example

Returns the value of body.

get("body");

if

Returns the then expression if the predicate is true, and the else expression if the predicate is false.

Function Input
predicate: boolean
then: T
else: T
Function Output
type: T

DSL example

If the value of the "kind" field is event, add a new field called main. If the value of the "kind" field is not event, add a new field called metrics.

if(eq(get("kind"), "event"), "main", "metrics");

list

Returns a list that contains the provided arguments.

Function Input
type: collection<T>
Function Output
type: collection<T>

DSL example

Returns a list of 1, 2, 3.

list(1, 2, 3);
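
The arguments do not have to be literals. A sketch that builds a list from two record fields, assuming the standard host and source fields are present as strings:

// Builds a two-element list from the host and source fields.
list(get("host"), get("source"));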

literal

Provides a way to wrap a literal value in a callable function. Calling the returned function returns the value passed to the literal function. This is useful if you want to compile a literal value in the UI or in certain cases with variadic functions. Because variadic functions accept either a value or a function, wrapping a value in a literal turns the value into a function.

Function Input
value: T
Function Output
type: T

DSL example

Return "foo" as a function.

literal("foo");

map_flatten

Accepts a nested map and flattens it with dot-concatenated field names. For example, the function flattens { "foo": {"bar": "baz"} } to { "foo.bar": "baz" }. You can optionally specify a different delimiter; see the second DSL example. Flattening a nested map can simplify DSL syntax and field extraction. For the preceding example, compare the nested and flattened DSL:

Nested DSL:    map_get(map_get(map_get(nestedMap, "foo"), "bar"), "baz")
Flattened DSL: map_get(flattenedMap, "foo.bar.baz")

First DSL example

Returns { "foo.bar": "baz" }.

as(map_flatten(create_map(literal("foo"), create_map("bar", "baz"))), "output");

Second DSL example

Uses _ as the delimiter when flattening the custom field value_map, producing flattened keys such as winlog_record_id. Outputs the flattened map in a custom field called value_flattened.

as(map_flatten(get("value_map"), "_"), "value_flattened");

map_get

Returns the value corresponding to a key in the map input.

Function Input
input: map<string, T>
key: string
Function Output
type: T

DSL example

Returns the value of the key "index" from the attributes field map.

map_get(get("attributes"), "index");

map_put

Accepts a variable list of keys and values, which must have a nonzero and even length, and inserts these key-value pairs into the map at the specified field name. This function does not add a new top-level field to your event; instead, it adds or updates keys inside a map field within the event.

You must use this function in the For each streaming function. This function will not validate in an Eval streaming function.

Function Input
field-name: string
keys-and-values: collection<T>
Function Output
type: Record<R>

DSL example

In the attributes field, which is a map, set the key "index" to the value "metrics".

map_put("attributes", "index", "metrics");

ucast

Provides a way to cast maps and collections, regardless of the data type that the map or collection may contain. You can also use an unsafe cast when the type is not known until runtime. Types are only checked at runtime.

Function Input
input: InT
target-type: string
default-value: any
Function Output
type: OutT

DSL example

Returns the value of the key "nested_map" from the attributes field, cast to type map<string, any>.

ucast(map_get(get("attributes"), "nested_map"), "map<string, any>", null);