Overview of sending data from DSP to the Splunk platform
You can send data from the Splunk Data Stream Processor (DSP) into a Splunk Enterprise or a Splunk Cloud environment. DSP uses the Splunk HTTP Event Collector (HEC) /services/collector/
endpoint to send data to a Splunk index. There are two functions that send data to the Splunk platform, but they should be used in different circumstances:
- Use the Write to Index function to send data to the default, pre-configured Splunk environment associated with your DSP installation using the default HEC token.
- Use the Write to Splunk Enterprise function to send data to another, external Splunk Enterprise or Splunk Cloud instance that you've configured or to multiple Splunk Enterprise or Splunk Cloud instances.
Both of these sink functions contain logic that determines how your event or metrics data is transformed into an HTTP Event Collector (HEC) event JSON or a metric JSON.
If your data source sends events with the data pipeline event schema or metrics schema, then you don't need to do any extra data transformation to send your data to a Splunk index, unless you are using a universal forwarder. The following data has the data pipeline event schema: data that comes in through the Splunk Forwarder Service, /events
and /metrics
data in the Ingest Service, and data that comes through the Read from Splunk Firehose function.
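The HEC event JSON that these sink functions ultimately produce follows the standard HEC event envelope. As a rough illustration (all field values below are placeholders, not output from an actual pipeline), an event payload looks like this:

```python
import json

# Sketch of the standard HEC event JSON envelope that data sent to
# /services/collector is wrapped in. All values here are placeholders.
hec_event = {
    "time": 1583865600.0,             # event timestamp, epoch seconds
    "host": "dsp-example-host",       # originating host
    "source": "example-source",
    "sourcetype": "example:sourcetype",
    "index": "main",                  # target Splunk index
    "event": "raw event body goes here",
    "fields": {"env": "dev"},         # optional indexed fields
}

print(json.dumps(hec_event, indent=2))
```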
If your data source doesn't have the data pipeline event schema or metrics schema, follow these steps to send data to a Splunk index.
By default, Splunk HEC endpoints are SSL-enabled and exposed over HTTPS. To securely send data to an SSL-enabled HEC endpoint, confirm with your DSP administrator that the proper environment variables have been set. See Configure the Data Stream Processor to send data to an SSL-enabled Splunk Enterprise instance. If you are using Splunk Enterprise, you can disable SSL on your HEC endpoint by going to Data Inputs > HTTP Event Collector and clicking Global Settings.
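If you need to verify connectivity to a HEC endpoint outside of DSP, you can exercise it directly. The sketch below builds (but does not send) a request to the /services/collector endpoint; the host, port, and token are placeholder assumptions you would replace with your own values:

```python
import json

def build_hec_request(host, token, event, port=8088, use_ssl=True):
    """Assemble the URL, headers, and body for a HEC request.

    The host and token here are placeholders. HEC tokens are passed
    in the Authorization header using the 'Splunk' scheme.
    """
    scheme = "https" if use_ssl else "http"
    url = f"{scheme}://{host}:{port}/services/collector"
    headers = {
        "Authorization": f"Splunk {token}",
        "Content-Type": "application/json",
    }
    body = json.dumps({"event": event})
    return url, headers, body

url, headers, body = build_hec_request(
    "splunk.example.com",                        # placeholder host
    "00000000-0000-0000-0000-000000000000",      # placeholder token
    {"message": "connectivity check"},
)
# To actually send the request, you could use, for example:
#   requests.post(url, headers=headers, data=body, verify=True)
```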
Send data to the Splunk platform with the Read from Splunk Firehose source function
If you are receiving streaming data from Splunk Firehose, follow these steps to send data to a Splunk Enterprise index:
- If you are planning to use the Write to Index sink function, skip this step. If you are planning to use the Write to Splunk Enterprise sink function, then Create a connection to the Splunk platform in DSP.
- From the Build Pipeline tab, select the Read from Splunk Firehose data source. The Splunk Firehose source function reads streaming data from the DSP Ingest, Collect, and Forwarders services.
- If your data pipeline is receiving data from a universal forwarder, you must perform additional transformations on your data. See Get data from a universal forwarder.
- Click + to add additional desired data transformations on your data pipeline.
- End your pipeline with the Write to Splunk Enterprise or the Write to Index sink function.
- Click Start Preview to verify that your pipeline is sending data to the Splunk Enterprise sink function.
- Save and activate your pipeline.
- After activating, click View next to your pipeline name to go to the View Only version of your activated pipeline, where you can see metrics across the entire pipeline.
- After you see data flowing through your activated pipeline, navigate to the Splunk platform.
- From the Search & Reporting app in the Splunk platform, search for your data:
index="your-index-name"
- (Optional) If your data pipeline is streaming data but it's not showing up in your index, check the HEC dashboards in the Splunk Monitoring Console to make sure that your HEC endpoint is receiving and indexing data.
Send data to the Splunk platform from a source that uses a connector
If your data is coming from any other data source, such as a streaming connector, you might need to perform additional formatting on your data before sending it to a Splunk index.
- If you are planning to use the Write to Index sink function, skip this step. If you are planning to use the Write to Splunk Enterprise sink function, then start by following the steps in create a Splunk Enterprise connection.
- From the Build Pipeline tab, select your data source.
- Format your data so that the HTTP Event Collector can recognize it. See the Unique pipeline requirements for specific data sources chapter in this manual, and select the topic for your data source. These topics provide introductory information on formatting data.
- Your events might still require additional formatting.
- End your pipeline with the Write to Splunk Enterprise or the Write to Index sink function.
- Click Start Preview to verify that your data is being sent through your pipeline to the Splunk Enterprise sink function.
- Save and activate your pipeline.
- After activating, click View next to your pipeline name to go to the View Only version of your activated pipeline, where you can see metrics across the entire pipeline.
- After you see data flowing through your activated pipeline, navigate to the Splunk platform.
- From the Search & Reporting app in the Splunk platform, search for your data:
index="your-index-name"
- (Optional) If your data is being sent through your data pipeline, but isn't making it through to your Splunk Index, make sure that you are formatting your event data or metrics data correctly. If your data is being formatted correctly, check the HEC dashboards in the Splunk Monitoring Console to make sure that HEC is receiving and indexing data.
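As a loose illustration of the kind of formatting described in this procedure, the sketch below maps an arbitrary connector record into the event shape that HEC expects. The input field names (`ts`, `msg`) and the sourcetype are hypothetical; your connector's records will differ:

```python
import time

def to_hec_event(record, index="your-index-name"):
    """Map a hypothetical connector record into a HEC event envelope.

    Pulls the timestamp out of the record when present, and sends the
    remaining fields as the event body.
    """
    body = dict(record)
    # Use the record's own timestamp when available, else "now".
    ts = body.pop("ts", time.time())
    return {
        "time": ts,
        "index": index,
        "sourcetype": "example:connector",  # placeholder sourcetype
        "event": body,
    }

formatted = to_hec_event({"ts": 1583865600, "msg": "disk usage at 91%"})
```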
This documentation applies to the following versions of Splunk® Data Stream Processor: 1.0.1