Create a Splunk DSP pipeline that processes universal forwarder data
Because the universal forwarder doesn't parse incoming data except in certain cases, you must use the Key By and Merge Events functions to properly ingest data from the universal forwarder into your data pipeline. The Data Stream Processor provides a Splunk universal forwarder template so that you do not need to construct this pipeline from scratch.
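To see why this keying and merging matters, the following minimal sketch (plain Python, not DSP's pipeline language; all names and sample data are illustrative) mimics the two steps: a forwarder can split a single event across chunk boundaries, so chunks are first grouped by their forwarder metadata, and the reassembled payload is then broken into per-line events.

```python
from collections import defaultdict

# Illustrative stand-in for the Key By + Merge Events pattern.
# Each raw chunk carries the metadata that DSP keys on: host, source,
# source_type, and forwarder_channel_id. Note that "event two" is
# split across the two chunks below.
chunks = [
    {"host": "web01", "source": "/var/log/app.log", "source_type": "app",
     "forwarder_channel_id": "ch-1", "body": "event one\nevent tw"},
    {"host": "web01", "source": "/var/log/app.log", "source_type": "app",
     "forwarder_channel_id": "ch-1", "body": "o\nevent three\n"},
]

# Key By: group chunks from the same stream so that their bodies can be
# reassembled in order.
streams = defaultdict(list)
for chunk in chunks:
    key = (chunk["host"], chunk["source"],
           chunk["source_type"], chunk["forwarder_channel_id"])
    streams[key].append(chunk["body"])

# Merge Events: concatenate the grouped bodies, then break the result
# into one event per line.
for key, bodies in streams.items():
    merged = "".join(bodies)
    events = merged.splitlines()
    print(key, events)
    # -> ('web01', '/var/log/app.log', 'app', 'ch-1')
    #    ['event one', 'event two', 'event three']
```

Without the grouping step, the partial bodies "event tw" and "o" could never be stitched back into a single "event two".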
Prerequisites
- A properly configured universal forwarder, as described in Send events using a forwarder.
Steps
- From the Build Pipeline page, select the Splunk universal forwarder template.
This template creates a pipeline that reads data from Splunk forwarders, does the processing required by the universal forwarder data source, and sends the data to the main index of the preconfigured Splunk Enterprise instance associated with the Data Stream Processor.
- Click Validate and Start Preview to check whether your events are passing through your pipeline as expected.
- (Optional) Verify that your data is successfully being broken up into events by clicking through each function in the pipeline.
  - Click the Key By function to verify that your events are being grouped correctly by host, source, source_type, and forwarder_channel_id.
  - Click the Merge Events function to verify that your events are being delimited correctly. By default, the Merge Events function uses the regular expression /([\r\n]+)/ to break incoming data into an event for each line, delimited by any run of carriage return (\r) or newline (\n) characters. See the sketch after these steps for a demonstration of this delimiter.
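As a quick illustration of the default delimiter (again plain Python, not DSP code; the capturing group in /([\r\n]+)/ is dropped here because the delimiters themselves are discarded), splitting on any run of \r or \n characters yields one event per line, regardless of whether the source uses Unix (\n), Windows (\r\n), or classic Mac (\r) line endings:

```python
import re

# DSP's default Merge Events delimiter: any run of \r or \n characters.
DELIMITER = re.compile(r"[\r\n]+")

raw = "unix line\nwindows line\r\nold mac line\rblank lines\n\n\nlast line"

# Each maximal run of \r/\n characters ends one event, so mixed line
# endings and consecutive blank lines all collapse to clean per-line events.
events = [e for e in DELIMITER.split(raw) if e]
print(events)
# ['unix line', 'windows line', 'old mac line', 'blank lines', 'last line']
```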