Splunk® Data Stream Processor

Connect to Data Sources and Destinations with DSP

On April 3, 2023, Splunk Data Stream Processor reached its end of sale, and it will reach its end of life on February 28, 2025. If you are an existing DSP customer, reach out to your account team for more information.

All DSP releases prior to DSP 1.4.0 use Gravity, a Kubernetes orchestrator that has been announced as end-of-life. Gravity has been replaced with an alternative component in DSP 1.4.0. Therefore, after July 1, 2023, Splunk no longer provides support for versions of DSP prior to DSP 1.4.0. We advise all customers to upgrade to DSP 1.4.0 in order to continue to receive full product support from Splunk.

Get data from HTTP clients into DSP using the Splunk HEC API

You can update an existing Splunk HTTP Event Collector (HEC) workflow to use DSP HEC so that your HTTP clients send data to DSP data pipelines instead of Splunk indexes. For more information about the differences between Splunk HEC and DSP HEC, see the HTTP Event Collector documentation.

To redirect data from your Splunk HEC workflow to DSP, start by creating a DSP HEC token and configuring your HTTP client to use it. Then, use the Splunk DSP Firehose source function in your pipeline to receive data from your HTTP client. Any data that your HTTP client sends to the Splunk HEC API goes to your DSP pipeline instead of the Splunk platform.

Prerequisites

Before you can update a Splunk HEC workflow to use DSP HEC, you must have the following:

Steps

  1. Update the HTTP client used in your current Splunk HEC workflow so that it starts sending data to DSP:
    1. Set the base URL to https://<DSP_HOST>, where <DSP_HOST> is the IP address of the controller node.
    2. Set the token to Authorization: Splunk <dsp-hec-token>, where <dsp-hec-token> is the token value that was returned when you created your DSP HEC token.
  2. If you are sending GZIP or ZSTD compressed data to your DSP pipeline, set the content encoding to "Content-Encoding: gzip" for GZIP or "Content-Encoding: zstd" for ZSTD.
  3. Restart your Splunk HEC workflow.
  4. In DSP, add the Splunk DSP Firehose source function to the start of your pipeline.
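The client-side configuration in the steps above can be sketched in Python. This is a minimal illustration using only the standard library, not part of the official DSP documentation; `build_hec_request` is a hypothetical helper name, and the `<DSP_HOST>` and `<dsp-hec-token>` placeholders become function arguments:

```python
import json
import urllib.request

def build_hec_request(dsp_host, hec_token, event, **fields):
    """Build a POST request for the DSP HEC /services/collector endpoint.

    dsp_host  -- IP address of the controller node (the <DSP_HOST> placeholder)
    hec_token -- the DSP HEC token value (the <dsp-hec-token> placeholder)
    """
    payload = {"event": event}
    if fields:
        payload["fields"] = fields
    return urllib.request.Request(
        url=f"https://{dsp_host}/services/collector",
        data=json.dumps(payload).encode("utf-8"),
        # Token authentication uses the "Splunk <token>" scheme.
        headers={"Authorization": f"Splunk {hec_token}"},
        method="POST",
    )

# To actually send the request:
# urllib.request.urlopen(build_hec_request("203.0.113.10", "my-token", "event1"))
```

The `urlopen` call is commented out because it requires a reachable DSP controller node.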

When you activate the pipeline, the source function starts receiving data from the HTTP client. For information about sending your data to the Splunk platform after processing it in DSP, see Connecting your DSP pipeline to the Splunk platform.

See the following sections for examples of how events and metrics can be sent to DSP through Splunk HEC API endpoints. See Set the default field values in DSP HEC for information on how you can specify default values for certain fields in the outgoing event or metric.

Example: Use the Splunk HEC API to send events to a DSP pipeline

The following example demonstrates how to use the Splunk HEC API /services/collector endpoint to send events to a DSP pipeline. In this example, <DSP_HOST> and <dsp-hec-token> are placeholders for the IP address of your controller node and your DSP HEC token, respectively.

curl -X POST "https://<DSP_HOST>/services/collector" \
   -H "Authorization: Splunk <dsp-hec-token>" \
   -d '{ 
         "event": "event1", 
         "time": 1505333071.247, 
         "index": "index1", 
         "source": "source1", 
         "sourcetype": "sourcetype1", 
         "host": "example.host.com", 
         "fields": {
                  "region": "northeast",
                  "datacenter": "dc",
                  "rack": "62",
                  "os": "Ubuntu18.04",
                  "arch": "x64"
                  }, 
         }'

Example: Use the Splunk HEC API to send metrics to a DSP pipeline

See Get started with metrics in the Splunk Enterprise Metrics manual for more information about using metrics with the Splunk platform.

The following example demonstrates how to use the Splunk HEC API /services/collector endpoint to send metrics to a DSP pipeline. In this example, <DSP_HOST> and <dsp-hec-token> are placeholders for the IP address of your controller node and your DSP HEC token, respectively.

curl -X POST "https://<DSP_HOST>/services/collector" \
   -H "Authorization: Splunk <dsp-hec-token>" \
   -d '{
         "time": 1505333071.247, 
         "index": "index1",
         "source": "source1", 
         "sourcetype": "sourcetype1", 
         "host": "example.host.com", 
         "fields": {
                  "region": "northeast",
                  "datacenter": "dc",
                  "rack": "62",
                  "os": "Ubuntu18.04",
                  "arch": "x64",        
                  "metric_name": "cpu.user",
                  "_value": 0.5
                   }
          }'

Example: Use the Splunk HEC API to send multiple metrics to a DSP pipeline

Version 8.0.0 and higher of the Splunk platform supports a JSON format which allows each JSON object to contain measurements for multiple metrics. You can use this multiple-metric JSON format with your DSP pipelines. See The multiple-metric JSON format in the Splunk Enterprise Metrics manual for more information.

The following example demonstrates how to use the Splunk HEC API /services/collector endpoint to send multiple metrics to a DSP pipeline. In this example, <DSP_HOST> and <dsp-hec-token> are placeholders for the IP address of your controller node and your DSP HEC token, respectively.

curl -X POST "https://<DSP_HOST>/services/collector" \
   -H "Authorization: Splunk <dsp-hec-token>" \
   -d '{
         "time": 1505333071.247, 
         "index": "index1",
         "source": "source1", 
         "sourcetype": "sourcetype1", 
         "host": "example.host.com", 
         "fields": {
                  "region": "northeast,
                  "datacenter": "dc",
                  "rack": "62",
                  "os": "Ubuntu18.04",
                  "arch": "x64",
                  "metric_name:cpu.idle": 28.7261266542177580,
                  "metric_name:cpu.usr": 28.7261266542177580,
                  "metric_name:cpu.sys": 28.7261266542177580
                   }
          }'

Example: Use basic authentication with the Splunk HEC API to send events to a DSP pipeline

You can use basic authentication, where the username is any alphanumeric string and the password is your DSP HEC token, to send events to a DSP pipeline through the Splunk HEC API.
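Building the Basic authentication credential yourself follows the standard HTTP scheme: base64-encode `<username>:<dsp-hec-token>`. A minimal sketch (the helper name `basic_auth_header` is hypothetical; curl does this for you when you pass `-u`):

```python
import base64

def basic_auth_header(username, dsp_hec_token):
    """Return the value for an HTTP "Authorization" header using Basic auth.

    Per the section above, the username can be any alphanumeric string and
    the password is the DSP HEC token.
    """
    creds = f"{username}:{dsp_hec_token}".encode("utf-8")
    return "Basic " + base64.b64encode(creds).decode("ascii")
```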

The following example demonstrates how to use basic authentication with the Splunk HEC API /services/collector/event endpoint to send events to a DSP pipeline. In this example, <username> and <dsp-hec-token> are placeholders for your username and your DSP HEC token, respectively. <DSP_HOST> is a placeholder for the IP address of your controller node.

curl -k -u  "<username>:<dsp-hec-token>" "https://<DSP_HOST>/services/collector/event" \
    -d '{ 
          "event": "event1", 
         "time": 1505333071.247, 
         "index": "index1", 
         "source": "source1", 
         "sourcetype": "sourcetype1", 
         "host": "example.host.com", 
         "fields": {
                  "region": "northeast",
                  "datacenter": "dc",
                  "rack": "62",
                  "os": "Ubuntu18.04",
                  "arch": "x64"
                   }
          }'

Example: Use the Splunk HEC API to send compressed data to a DSP pipeline

The following example demonstrates how to use the Splunk HEC API /services/collector endpoint to send data compressed with GZIP to a DSP pipeline. This example assumes that your events data is already compressed with GZIP and the file name is events.gz.

curl -X POST "https://<DSP_HOST>/services/collector" \
   -H "Authorization: Splunk <dsp-hec-token>" \
   -H "Content-Encoding: gzip" \
   --data-binary @events.gz

The following example demonstrates how to use the Splunk HEC API /services/collector endpoint to send data compressed with ZSTD to a DSP pipeline. This example assumes that your events data is already compressed with ZSTD and the file name is events.zstd.

curl -X POST "https://<DSP_HOST>/services/collector" \
   -H "Authorization: Splunk <dsp-hec-token>" \
   -H "Content-Encoding: zstd" \
   --data-binary @events.zstd
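If you need to produce the events.gz file that the GZIP example above posts, a minimal Python sketch follows. It assumes newline-separated JSON event objects, which the HEC collector endpoint accepts in a single request body; the event values are placeholders:

```python
import gzip
import json

# A hypothetical batch of HEC events to send in one compressed request.
events = [{"event": "event1"}, {"event": "event2"}]

# HEC accepts multiple JSON event objects concatenated in one body;
# newline separators keep the payload readable.
body = "\n".join(json.dumps(e) for e in events).encode("utf-8")

# Write the compressed payload that the curl example sends as @events.gz.
with gzip.open("events.gz", "wb") as f:
    f.write(body)
```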
Last modified on 13 January, 2023

This documentation applies to the following versions of Splunk® Data Stream Processor: 1.4.0, 1.4.1, 1.4.2, 1.4.3, 1.4.4, 1.4.5, 1.4.6

