Splunk® Data Stream Processor

Function Reference




Parse delimited

Parses CSV or TSV data from a delimited text file using the specified delimiter. Use this function in conjunction with the File source function.

Function Input
collection<record<R>>
This function takes in collections of records with schema R.
Function Output
collection<record<S>>
This function outputs the same collection of records but with a different schema S.
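
For example, with a comma delimiter and the headers host,source, a record whose body is server01,/var/log/app.log (an illustrative value, not taken from this manual) is output with a host field of server01 and a source field of /var/log/app.log, so schema S adds one top-level field per header name.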

Arguments

body
    Input: expression<string>
    Description: An expression that contains the body of the record to be parsed.
    UI example: get("body");

field-delimiter
    Input: string
    Description: The delimiter that separates the fields in the static file.
    UI example: ,

header
    Input: string
    Description: A delimited list of field header names. Use the same delimiter as the field-delimiter argument.
    UI example: host,source

Full DSL example

This example parses the comma-delimited body field into the host and source fields:

parse-delimited(events, get("body"), ",", "host,source");
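
The Python sketch below is a conceptual illustration of what the DSL call above does to a single record. It is not the DSP implementation; the sample record, the helper name, and the assumption that parsed fields are merged back into the original record are illustrative only.

# Conceptual sketch of parse-delimited(events, get("body"), ",", "host,source").
# Illustration only; not the DSP implementation. The sample record is assumed.
def parse_delimited(record, body, field_delimiter, header):
    header_names = header.split(field_delimiter)
    values = body.split(field_delimiter)
    # Pair each header name with the corresponding value from the body,
    # then merge the parsed fields into the record (assumed behavior).
    return {**record, **dict(zip(header_names, values))}

event = {"body": "server01,/var/log/app.log"}
print(parse_delimited(event, event["body"], ",", "host,source"))
# {'body': 'server01,/var/log/app.log', 'host': 'server01', 'source': '/var/log/app.log'}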

This documentation applies to the following versions of Splunk® Data Stream Processor: 1.0.1

