Create a Splunk DSP pipeline that processes universal forwarder data
Because the universal forwarder doesn't parse incoming data except in certain cases, you must use the Group by and Merge Events functions to correctly ingest data from the universal forwarder into your data pipeline. The Data Stream Processor provides a Splunk universal forwarder template so that you don't need to construct this pipeline from scratch.
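As a rough illustration of why this merging step matters, the following minimal Java sketch (a hypothetical example, not DSP internals) shows how the lines of a single multi-line event can arrive as separate unparsed records that the pipeline must rejoin:

import java.util.Arrays;
import java.util.List;

public class WhyMergeEvents {
    public static void main(String[] args) {
        // Hypothetical example: the lines of one logical event arrive
        // from the forwarder as separate, unparsed records.
        List<String> rawRecords = Arrays.asList(
            "2018-12-18 15:09:00,144 log event 1",
            "log event 1 continue",
            "log event 1 continue");

        // Without the Group by and Merge Events stages, each record above
        // would be treated as its own event. Merging rejoins them into one:
        String merged = String.join("\n", rawRecords);
        System.out.println(merged);
    }
}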
Prerequisites
- A properly configured universal forwarder, as described in Send events using a forwarder.
Steps
- From the Build Pipeline page, select the Splunk universal forwarder template.
This template creates a pipeline that reads data from Splunk forwarders, performs the processing required by the universal forwarder data source, and sends the data to the main index of the preconfigured Splunk Enterprise instance associated with the Data Stream Processor.
- Most of the pipeline is preconfigured for you, but you'll need to provide a regular expression delimiter to correctly stitch your events together. Click the Merge Events function.
- In the Delimiter text box, use a Java 8 regular expression to express a delimiter for your log files. For example, if your log file events look like:
2018-12-18 15:09:00,144 log event 1
log event 1 continue
log event 1 continue
log event 1 continue
2018-12-18 15:09:01,144 log event 2

then you can use the following regular expression to correctly group the events together:

(\\n)[0-9]{4}-[0-9]{2}-[0-9]{2}
This results in your data being correctly broken into two events:

Event 1:
2018-12-18 15:09:00,144 log event 1
log event 1 continue
log event 1 continue
log event 1 continue

Event 2:
2018-12-18 15:09:01,144 log event 2

A short Java sketch after these steps shows how this delimiter locates the event boundary.
- Click Validate and Start Preview to check whether your events are passing through your pipeline as expected.
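The following is a minimal Java sketch of how a delimiter like the one above identifies event boundaries. It is illustrative only: the DelimiterDemo class and the slicing logic are assumptions for demonstration, not DSP's actual Merge Events implementation. Java is used because the Delimiter field accepts Java 8 regular expressions.

import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class DelimiterDemo {
    public static void main(String[] args) {
        // The raw text from the example above, as one unparsed block.
        String raw = "2018-12-18 15:09:00,144 log event 1\n"
                   + "log event 1 continue\n"
                   + "log event 1 continue\n"
                   + "log event 1 continue\n"
                   + "2018-12-18 15:09:01,144 log event 2";

        // The delimiter from the step above: a newline followed by a
        // yyyy-MM-dd date marks the start of a new event.
        Pattern delimiter = Pattern.compile("(\\n)[0-9]{4}-[0-9]{2}-[0-9]{2}");

        // Slice at each boundary so every event keeps its leading
        // timestamp (Pattern.split would consume the matched date).
        List<String> events = new ArrayList<>();
        Matcher m = delimiter.matcher(raw);
        int start = 0;
        while (m.find()) {
            events.add(raw.substring(start, m.start()));
            start = m.start() + 1; // resume just after the newline
        }
        events.add(raw.substring(start));

        for (String event : events) {
            System.out.println("--- event ---");
            System.out.println(event);
        }
    }
}

Running this prints the two events shown earlier, each beginning with its own timestamp.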
This documentation applies to the following versions of Splunk® Data Stream Processor: 1.0.1