Splunk® Data Stream Processor

Function Reference



DSP 1.2.0 is impacted by the CVE-2021-44228 and CVE-2021-45046 security vulnerabilities from Apache Log4j. To fix these vulnerabilities, you must upgrade to DSP 1.2.4. See Upgrade the Splunk Data Stream Processor to 1.2.4 for upgrade instructions.

On October 30, 2022, all 1.2.x versions of the Splunk Data Stream Processor will reach their end of support date. See the Splunk Software Support Policy for details.

Casting

The Splunk Data Stream Processor is strongly and implicitly typed. This means that, in order to satisfy the type checker, data sometimes needs to be converted or cast to a different type. The following scalar functions can be used for type conversions. See data types for information on casting between data types.

cast(input, target_type)

Converts a field from one data type to another data type based on the conversion rules. For common conversions, especially from a string to another basic data type, a dedicated conversion function often exists and should be preferred when available. However, conversion functions don't exist for every pair of types (for example, from int to long), and the cast function can always be used as a fallback. If the requested conversion is not supported, null is returned.

The cast function handles conversions between basic types. To cast complex types such as maps and collections, use ucast.
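For instance, because there is no dedicated conversion function from int to long, cast serves as the fallback. The following is a minimal sketch that assumes a hypothetical integer field named retry_count; if the conversion were not supported, the function would return null instead.

... | eval retry_count=cast(retry_count, "long") | ...;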

Function Input
input: InT
target_type: A basic data type. To see the basic data types that the Data Stream Processor supports, see basic data types. The data type name is case-sensitive and must be lowercase.
Function Output
type: OutT

1. SPL2 example

Cast the body field to type string.

When working in the SPL View, you can write the function by using the following syntax.

... | eval body=cast(body, "string") | ...;

2. SPL2 example

Cast the body field to type string, then filter records based on whether the pattern ASA-x-xxxxxx matches any value in the body field.

When working in the SPL View, you can write the function by using the following syntax.

... | where match_regex(cast(body, "string"), /%ASA-\d-\d{6}/) | ...;

3. SPL2 example

Alternatively, you can use named arguments to list the arguments in any order.

... | eval body=cast(target_type: "string", input: body) | ...;

ucast(input, target_type, default_value)

Casts data to a new type. The unsafe cast, or ucast, simply assigns the specified type to the data; correctness is not checked until run time. If the cast fails at run time, the value specified in default_value is returned.

The ucast function provides a way to cast maps and collections, regardless of the data type that the map or collection may contain.
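As a sketch of casting a collection, assume a hypothetical key named tags inside the attributes map that holds a list of strings. The following pipeline fragment extracts the value with map_get and assigns it the type collection&lt;string&gt;, falling back to null if the cast fails at run time.

... | eval tags=ucast(map_get(attributes, "tags"), "collection<string>", null) | ...;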

Function Input
input: InT
target_type: A basic or complex data type. To see the data types that the Data Stream Processor supports, see data types. The data type name is case-sensitive and must be lowercase.
default_value: any
Function Output
type: OutT

1. SPL2 example

The following example performs an unsafe cast on the nested_map field in attributes to have type map<string, any>.

When working in the SPL View, you can write the function by using the following syntax.

...| eval n=ucast(map_get(attributes, "nested_map"), "map<string, any>", null) | ...;

2. SPL2 example

Suppose the body field contains the JSON object {"name":"demo","unit":"percent","type":"GAUGE","value":37,"dimensions":{"region":"us-east-1","sf_hires":"1"}} and you want to treat it as a JSON array. The following example casts body to type collection<any>, returning [{"name":"demo","unit":"percent","type":"GAUGE","value":37,"dimensions":{"region":"us-east-1","sf_hires":"1"}}] in the field n.

When working in the SPL View, you can write the function by using the following syntax.

...| eval n=ucast(body, "collection<any>", null) | ...;

3. SPL2 example

Alternatively, you can use named arguments to list the arguments in any order.

...| eval n=ucast(input: map_get(attributes, "nested_map"), default_value: null, target_type: "map<string, any>") | ...;


