Splunk® Data Stream Processor

Function Reference



DSP 1.2.0 is impacted by the CVE-2021-44228 and CVE-2021-45046 security vulnerabilities from Apache Log4j. To fix these vulnerabilities, you must upgrade to DSP 1.2.4. See Upgrade the Splunk Data Stream Processor to 1.2.4 for upgrade instructions.

On October 30, 2022, all 1.2.x versions of the Splunk Data Stream Processor will reach their end of support date. See the Splunk Software Support Policy for details.

To Splunk JSON

This topic describes how to use the To Splunk JSON function in the Splunk Data Stream Processor.

Description

Formats incoming records to adhere to the Splunk HEC event JSON or the Splunk HEC metric JSON format.

In order to send data to the Splunk platform, you must format your records so that they can be mapped to either the Splunk HEC event JSON or the Splunk HEC metrics JSON schema. See Format event data for Splunk indexes for information on how records are mapped to the HEC event JSON schema. See Format metrics data for Splunk indexes for information on how records are mapped to the HEC metrics JSON schema. Use this function to format incoming records into HEC JSON using those mapping rules. If you want to transform your records into the HEC metrics JSON schema, you must set the kind field to metric.
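As an illustration of those mapping rules, the transformation of an event record into HEC event JSON can be sketched in Python. This is a hedged sketch, not the actual DSP implementation; the function name `to_splunk_json_sketch` is hypothetical, and the record field names follow the DSP event schema and the example output shown later in this topic:

```python
import json

def to_splunk_json_sketch(record: dict) -> str:
    """Illustrative sketch: map a DSP event record to a Splunk HEC event JSON string."""
    hec = {
        "event": record["body"],              # the record body becomes the HEC event payload
        "sourcetype": record["source_type"],  # DSP source_type maps to HEC sourcetype
        "host": record["host"],
        # DSP timestamps are epoch milliseconds; HEC "time" is epoch seconds
        "time": "%.3f" % (record["timestamp"] / 1000),
    }
    # The index is read from the record's attributes map, as the index
    # argument's Canvas View example does with map_get(attributes, "index")
    index = record.get("attributes", {}).get("index")
    if index is not None:
        hec["index"] = index
    return json.dumps(hec)
```

Running this sketch on the example record from the Usage section below yields a JSON string with the same `event`, `sourcetype`, `host`, `index`, and `time` values as the documented output.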

Function Input/Output Schema

Function Input
collection<record<R>>
This function takes in collections of records with schema R.
Function Output
collection<record<json>>
This function outputs records with a single field json that contains the HEC JSON string.

Syntax

The required syntax is in bold.

to_splunk_json
index=<expression<string>>
keep_attributes=<bool>

Required arguments

index
Syntax: expression<string>
Description: An expression to get the desired index field.
Example in Canvas View: cast(map_get(attributes, "index"), "string")

Optional arguments

keep_attributes
Syntax: <boolean>
Description: If true, the DSP "attributes" map from events is transformed into the HEC event JSON fields object and will be available as index-extracted fields in the Splunk platform. If there is an index entry in the attributes map, it is ignored and is not added to the JSON fields object.
Default: false
Example in Canvas View: true
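The keep_attributes behavior described above can be sketched as follows. This is an illustration under the stated assumptions, not the DSP implementation; it shows only how the attributes map becomes the HEC event JSON fields object, with the reserved index entry excluded:

```python
def build_fields(attributes: dict) -> dict:
    """Sketch of keep_attributes=true: copy the record's attributes map into
    the HEC 'fields' object, dropping any 'index' entry, which is handled by
    the index argument instead."""
    return {k: v for k, v in attributes.items() if k != "index"}
```

For example, an attributes map of `{"attr1":"val1", "index":"myindex"}` produces a fields object of `{"attr1":"val1"}`, matching the second output example in the Usage section below.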

Usage

The following is an example of what your records look like after using the to_splunk_json function. Assume that your data looks something like the following snippet, and you've configured the function with the arguments shown in the first SPL2 example.

Record{ 
body="Hello World", source_type="mysourcetype", id="id12345", source="mysource", timestamp=1234567890012, host="myhost", attributes={"attr1":"val1", "index":"myindex"}}

The To Splunk JSON function outputs your records like this:

Record {
"json" = '{"event":"Hello World", "sourcetype":"mysourcetype", "host":"myhost", "index": "myindex", "time":"1234567890.012"}'
}

If you've configured your function with the arguments shown in the second SPL2 example instead, then the To Splunk JSON function outputs your records like this:

Record {
"json" = '{"event":"Hello World", "sourcetype":"mysourcetype", "host":"myhost", "index": "myindex", "time":"1234567890.012", "fields":{"attr1":"val1"}}'
}

SPL2 examples

The following examples of common use cases assume that you are in the SPL View.

When working in the SPL View, you can write the function by providing the arguments in the exact order shown in each use case.

1. Formats incoming records to the HEC event JSON schema

...| to_splunk_json index=cast(map_get(attributes, "index"), "string") |...;

2. Formats incoming records to the HEC event JSON schema with keep_attributes set to true

...| to_splunk_json index=cast(map_get(attributes, "index"), "string") keep_attributes=true |...;
Last modified on 20 April, 2021

This documentation applies to the following versions of Splunk® Data Stream Processor: 1.2.0, 1.2.1-patch02, 1.2.1, 1.2.2-patch02, 1.2.4, 1.2.5, 1.3.0, 1.3.1, 1.4.0, 1.4.1, 1.4.2, 1.4.3

