Use the Splunk HEC API to send events and metrics to a DSP pipeline
You can use the Splunk HEC API to send events and metrics to a DSP pipeline via the Read from Splunk Firehose data source.
DSP HEC does not share tokens with Splunk HEC. You must generate a DSP HEC token with the Ingest API and then configure your HTTP clients with the DSP HEC token to send data to the DSP Firehose.
There are three Splunk HEC API endpoints you can use to send data to a DSP pipeline: /services/collector, /services/collector/event, and /services/collector/event/1.0. You can send both metrics and events to /services/collector. You can send only events to /services/collector/event and /services/collector/event/1.0.
See Set up and use HTTP Event Collector in Splunk Web for more information about setting up and using Splunk HEC.
The following example demonstrates how to use the Splunk HEC API /services/collector endpoint to send events to a DSP pipeline.
curl -X POST "https://<DSP_HOST>:31000/services/collector" \
    -H "Authorization: Splunk <dsphec-token>" \
    -d '{
        "event": "event1",
        "sourcetype": "sourcetype1",
        "time": 1234567890012,
        "source": "source1",
        "host": "host1",
        "fields": {
            "key1": "value1",
            "key2": "value2"
        },
        "index": "index1"
    }'
The following example demonstrates how to use the Splunk HEC API /services/collector endpoint to send metrics to a DSP pipeline.
curl -X POST "https://<DSP_HOST>:31000/services/collector" \
    -H "Authorization: Splunk <dsphec-token>" \
    -d '{
        "event": "event1",
        "sourcetype": "sourcetype1",
        "time": 1234567890012,
        "source": "source1",
        "host": "host1",
        "fields": {
            "testID": "multiple_metric_record",
            "key1": "value1",
            "_value": 123,
            "metric_name": "total"
        },
        "index": "index1"
    }'
You can use basic authentication with the Splunk HEC API, where the username is <dsphec-token-name> and the password is <dsphec-token>, to send events to DSP HEC. The following example demonstrates how to use basic authentication with the Splunk HEC API /services/collector/event endpoint to send events to a DSP pipeline.
curl -k -u "<dsphec-token-name>:<dsphec-token>" "https://<DSP_HOST>:31000/services/collector/event" \
    -d '{
        "event": "event1",
        "sourcetype": "sourcetype1",
        "time": 1234567890012,
        "source": "source1",
        "host": "host1",
        "fields": {
            "key1": "value1",
            "key2": "value2"
        },
        "index": "index1"
    }'
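If your HTTP client does not have built-in support for basic authentication, you can construct the Authorization header yourself. The following is a minimal Python sketch showing that curl's -u "<name>:<token>" option is equivalent to sending a Basic header containing the base64-encoded name:token pair. The token name and value below are hypothetical placeholders; substitute your own DSP HEC token.

```python
import base64

# Hypothetical token name and value; substitute your own DSP HEC token.
token_name = "my-dsp-token"
token_value = "dsphec-token-value"

# curl -u "<name>:<token>" sends this header; building it manually is
# equivalent when your HTTP client has no built-in basic-auth support.
credentials = f"{token_name}:{token_value}"
auth_header = "Basic " + base64.b64encode(credentials.encode()).decode()

print(auth_header)
```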
Use URL query string parameters with DSP HEC
The following URL query string parameters can be used with DSP HEC to set default field values for all events in a request. The defaults can be overridden by parameters in the JSON request body content.
Parameter | Data type | Description
---|---|---
source | string | Sets a default source field value for all events in the request.
sourcetype | string | Sets a default sourcetype field value for all events in the request.
index | string | Sets a default index field value for all events in the request.
The following example demonstrates how to use URL query string parameters with the Splunk HEC API /services/collector endpoint to set default field values.
https://<DSP_HOST>:31000/services/collector?source=source1&sourcetype=sourcetype1&index=index1
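If you build request URLs programmatically, the query string above can be assembled from a dictionary of defaults. The following is a minimal Python sketch using the standard library; the host name is a hypothetical placeholder, and the parameter names match the table above.

```python
from urllib.parse import urlencode

# Hypothetical host; replace with your DSP host.
base = "https://dsp.example.com:31000/services/collector"

# Defaults applied to every event in the request; a field set in the
# JSON body of an individual event overrides the query string default.
defaults = {"source": "source1", "sourcetype": "sourcetype1", "index": "index1"}

url = base + "?" + urlencode(defaults)
print(url)
```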
Example workflow: Use Splunk HEC to send data to a DSP pipeline
- Create a new pipeline in DSP. See Create a pipeline using the Canvas Builder for more information on creating a new pipeline.
- Add Read from Splunk Firehose as your data source.
- Add transforms and data destinations as needed in your DSP pipeline.
- Create a DSP HEC token with the Ingest REST API or create a DSP HEC token with SCloud. Make sure you copy the <dsphec-token> after it is created. You will need it for the next step.
- Update the base URL and token in the HTTP client used in your current Splunk HEC workflow.
  - Set the URL to https://<DSP_HOST>:31000.
  - Set the token to Authorization: Splunk <dsphec-token>.
- Restart your Splunk HEC workflow.
- Use DSP to transform and troubleshoot your data, and then send that data to your data destination.
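The client-side update in the workflow above amounts to pointing your existing HEC client at the DSP host and swapping in the DSP HEC token. The following is a minimal Python sketch of that step using only the standard library; the host and token values are hypothetical placeholders, so the actual network call is left commented out.

```python
import json
import urllib.request

# Hypothetical values; substitute your DSP host and DSP HEC token.
DSP_URL = "https://dsp.example.com:31000/services/collector/event"
DSP_HEC_TOKEN = "dsphec-token-value"

event = {
    "event": "event1",
    "sourcetype": "sourcetype1",
    "source": "source1",
    "host": "host1",
    "index": "index1",
}

# Same request shape as an existing Splunk HEC client; only the base
# URL and the token value change when pointing at DSP.
request = urllib.request.Request(
    DSP_URL,
    data=json.dumps(event).encode(),
    headers={"Authorization": f"Splunk {DSP_HEC_TOKEN}"},
    method="POST",
)

# Uncomment to send once DSP_URL points at a real DSP host:
# with urllib.request.urlopen(request) as response:
#     print(response.read())
print(request.full_url)
print(request.get_header("Authorization"))
```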
This documentation applies to the following versions of Splunk® Data Stream Processor: 1.1.0