Splunk® Data Stream Processor

Getting Data In

On April 3, 2023, Splunk Data Stream Processor will reach its end of sale, and will reach its end of life on February 28, 2025. If you are an existing DSP customer, please reach out to your account team for more information.

Supported ingestion methods and data sources for the Splunk Data Stream Processor

You can get data into your data pipeline in the following ways.

Supported ingestion methods

The following ingestion methods are supported.

Send events using the Ingest Service

Use the Ingest Service to send JSON objects to the /events or the /metrics endpoint. See Format and send events using the Ingest Service. For the Ingest REST API Reference, see the Ingest REST API on the Splunk developer portal.
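As a sketch of what an Ingest Service request can look like: each event is a JSON object with fields such as a `body` payload and a millisecond `timestamp`, sent as a JSON array with a bearer token. The endpoint URL, token, and exact field names below are illustrative placeholders; the Ingest REST API reference defines the authoritative schema.

```python
import json
import time
import urllib.request

def build_event(body, source="my-app", sourcetype="json", host="web-01"):
    """Build one event object for the /events endpoint.

    The field names here (body, timestamp, source, sourcetype, host)
    follow the general shape of the Ingest API; check the Ingest REST
    API reference for the authoritative schema.
    """
    return {
        "body": body,
        "timestamp": int(time.time() * 1000),  # epoch milliseconds
        "source": source,
        "sourcetype": sourcetype,
        "host": host,
    }

def send_events(events, endpoint, token):
    """POST a batch of events as a JSON array with a bearer token."""
    req = urllib.request.Request(
        endpoint,
        data=json.dumps(events).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",
        },
        method="POST",
    )
    return urllib.request.urlopen(req)

# Example payload, shown here rather than sent:
batch = [build_event({"message": "user login", "status": "ok"})]
print(json.dumps(batch, indent=2))
```

The `/metrics` endpoint follows the same pattern but carries metric data points instead of an event `body`.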

Send events using the Forwarders service

Send data from a Splunk forwarder to the Splunk Forwarders service. See Send events using a forwarder. For the Forwarders REST API Reference, see the Forwarders REST API on the Splunk developer portal.
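On the forwarder side, this amounts to pointing outputs.conf at the Forwarders service. The host, port, and certificate path below are placeholders, and DSP requires a client certificate that you register through the Forwarders service first, so treat this as a sketch of the general shape rather than a working configuration:

```
[tcpout]
defaultGroup = dsp

[tcpout:dsp]
# Placeholder host and port for the DSP Forwarders service
server = <dsp-host>:<forwarder-service-port>
# Client certificate previously registered with the Forwarders service
clientCert = /opt/splunkforwarder/etc/auth/my-forwarder.pem
```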

Get data in using the Collect service

You can use the Collect service to manage how data collection jobs ingest event and metric data. See Get data in with the Collect service and a pull-based connector. For the Collect REST API Reference, see the Collect REST API on the Splunk developer portal.
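A collect job pairs a pull-based connector with a schedule that controls how often data is fetched. As an illustrative sketch (the connector ID, cron schedule, and field names below are placeholders; the Collect REST API reference defines the real request schema), a job definition might be assembled like this:

```python
import json

def build_collect_job(connector_id, schedule, parameters, name="my-collect-job"):
    """Assemble a data collection job definition for the Collect service.

    Field names are illustrative; see the Collect REST API reference
    for the authoritative schema.
    """
    return {
        "name": name,
        "connectorID": connector_id,  # which pull-based connector to run
        "schedule": schedule,         # cron expression for the pull cadence
        "parameters": parameters,     # connector-specific settings
    }

job = build_collect_job(
    connector_id="aws-cloudwatch-metrics",  # hypothetical connector ID
    schedule="*/5 * * * *",                 # every 5 minutes
    parameters={"region": "us-west-2"},
)
print(json.dumps(job, indent=2))
```

You would POST a definition like this to the Collect service, which then runs the connector on the given schedule and streams the results into your pipeline.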

Get data in using a connector

A connector connects a data pipeline with an external data source. There are two types of connectors: pull-based connectors, which the Collect service runs on a schedule to fetch data from an external source, and push-based connectors, which receive data that an external source sends to them.

Read from Splunk Firehose

Use Splunk Firehose to read data from the Ingest REST API, Forwarders, and Collect services. See Splunk Firehose.
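In a pipeline, Splunk Firehose appears as a source function that unions data from those services into a single stream. As a hedged sketch of what such a pipeline can look like in DSP's SPL2-style syntax (the exact function names and arguments are defined in the Function Reference manual, so treat these as illustrative):

```
| from read_splunk_firehose()
| where sourcetype = "access_combined"
| into <sink function>;
```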

Supported data sinks

Use the Splunk Data Stream Processor to send data to the following sinks, or destinations:

Send data to a Splunk index

See Write to Index and Write to Splunk Enterprise in the Splunk Data Stream Processor Function Reference manual.

Send data to Apache Kafka

See Write to Kafka in the Splunk Data Stream Processor Function Reference manual. Sending data to Apache Kafka requires the DSP universal license. See DSP universal license in the Install and administer the Data Stream Processor manual.
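A Kafka sink closes out a pipeline the same way: the stream is written into a sink function that takes the broker and topic to write to. The function name and arguments below are an illustrative SPL2-style sketch; the Function Reference manual defines the real signature:

```
| from read_splunk_firehose()
| into write_kafka("<broker-host>:9092", "<topic>");
```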

Send data to Amazon Kinesis

See Write to Kinesis in the Splunk Data Stream Processor Function Reference manual. Sending data to Amazon Kinesis requires the DSP universal license. See DSP universal license in the Install and administer the Data Stream Processor manual.

Architecture diagrams

The following diagram summarizes the data sources and data sinks that the Splunk Data Stream Processor supports.

The following diagram shows the services and paths through which data can enter your pipeline: data moves from your chosen data sources into the Data Stream Processor, through a pipeline (the Streams service), and then to a chosen destination.

Last modified on 01 April, 2020

This documentation applies to the following versions of Splunk® Data Stream Processor: 1.0.0

