Splunk® Data Stream Processor

Getting Data In


Supported ingestion methods and data sources for the Splunk Data Stream Processor

You can get data into your data pipeline in the following ways.

Supported ingestion methods

The following ingestion methods are supported.

Send events using the Ingest REST API

Use the Ingest REST API to send JSON objects to the /events or the /metrics endpoint. See Format and send events using the Ingest REST API.
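As a minimal sketch of what sending an event looks like, the following Python snippet builds an event object and posts a JSON array of events to the /events endpoint. The endpoint URL, API version path, and token here are placeholder assumptions; take the exact request format and endpoint path from the Ingest REST API documentation for your DSP version.

```python
import json
import time

# Placeholder values -- replace with your own tenant host, API path, and token.
INGEST_URL = "https://example.scp.splunk.com/default/ingest/v1beta2/events"  # assumed path
TOKEN = "<your-bearer-token>"

def build_event(body, source="my-app", sourcetype="json"):
    """Assemble one event object as a JSON-serializable dict."""
    return {
        "body": body,                          # the event payload itself
        "source": source,
        "sourcetype": sourcetype,
        "timestamp": int(time.time() * 1000),  # epoch time in milliseconds
    }

def post_events(events):
    """POST a JSON array of events with a bearer token (stdlib only)."""
    import urllib.request
    req = urllib.request.Request(
        INGEST_URL,
        data=json.dumps(events).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {TOKEN}",
            "Content-Type": "application/json",
        },
    )
    return urllib.request.urlopen(req)

# Build a batch of one event and show the request body that would be sent.
payload = [build_event({"message": "user login", "status": "ok"})]
print(json.dumps(payload, indent=2))
```

The /metrics endpoint accepts the same kind of JSON array, with metric-specific fields in place of `body`.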

Send events using the Forwarders service

Send data from a Splunk forwarder to the Splunk Forwarders service. See Send events using a forwarder.
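On the forwarder side, routing data to the Forwarders service is configured in outputs.conf. The sketch below shows the general shape only; the host, port, and certificate settings are placeholders, and the Forwarders service requires its own client certificate setup, so follow the forwarder setup steps in the DSP documentation for the exact values.

```
# Hypothetical outputs.conf sketch -- host, port, and certificate path
# are placeholders. See the DSP forwarder setup documentation for the
# exact settings required by your deployment.
[tcpout]
defaultGroup = dsp-forwarders

[tcpout:dsp-forwarders]
server = forwarders.example.scp.splunk.com:9997
clientCert = /opt/splunkforwarder/etc/auth/mycerts/forwarder.pem
```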

Get data in using the Collect service

You can use the Collect service to manage how data collection jobs ingest event and metric data. See the Collect service documentation.
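As an illustration of what defining a collection job involves, the following Python sketch assembles a job specification as a JSON-serializable dict. Every field name and value here is an illustrative placeholder, not the actual Collect service schema; consult the Collect service documentation for the real request body.

```python
import json

# Sketch of a data collection job definition. All field names below are
# illustrative placeholders -- the Collect service documentation defines
# the actual job schema.
def build_collect_job(name, connector_id, schedule, parameters):
    """Assemble a job spec dict ready to serialize as a request body."""
    return {
        "name": name,
        "connectorID": connector_id,  # which connector the job uses
        "schedule": schedule,         # cron-style schedule string
        "parameters": parameters,     # connector-specific settings
    }

job = build_collect_job(
    name="nightly-metrics-pull",
    connector_id="example-connector",
    schedule="0 2 * * *",             # run at 02:00 every day
    parameters={"region": "us-east-1"},
)
print(json.dumps(job, indent=2))
```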

Get data in using a connector

A connector connects a data pipeline with an external data source. See the connector documentation for the types of connectors that are available and how to configure them.

Read from Splunk Firehose

Use Splunk Firehose to read data from the Ingest REST API, Forwarders, and Collect services. See Splunk Firehose.
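As a sketch, a pipeline that reads from Splunk Firehose starts with the `read_splunk_firehose` source function and can then filter or transform the combined stream. The filter field and value below are illustrative assumptions; verify the function name and event schema against the Function Reference for your DSP version.

```
| from read_splunk_firehose()
| where source_type = "example:sourcetype";
```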

Supported data sinks

Use the Splunk Data Stream Processor to send data to the following sinks, or destinations:

Send data to a Splunk index

See Write to Index and Write to Splunk Enterprise in the Splunk Data Stream Processor Function Reference manual.

Send data to Apache Kafka

See Write to Kafka in the Splunk Data Stream Processor Function Reference manual. Sending data to Apache Kafka requires the DSP universal license. See DSP universal license in the Install and administer the Data Stream Processor manual.

Send data to Amazon Kinesis

See Write to Kinesis in the Splunk Data Stream Processor Function Reference manual. Sending data to Amazon Kinesis requires the DSP universal license. See DSP universal license in the Install and administer the Data Stream Processor manual.

Architecture diagrams

The following diagram summarizes the data sources and data sinks that the Splunk Data Stream Processor supports.

The following diagram shows the services and paths through which data can enter your data pipeline: data moves from your chosen data sources into the Data Stream Processor, through a pipeline (the Streams service), and then on to a chosen destination.


This documentation applies to the following versions of Splunk® Data Stream Processor: 1.0.0

