Stream Processor Service

Connect to Data Sources and Destinations with the Stream Processor Service


Data sources and destinations

The Stream Processor Service can collect data from and send data to a variety of locations, including databases, monitoring services, and pub/sub messaging systems. The location that the Stream Processor Service collects data from is called a "data source", while the location that it sends data to is called a "data destination".

Each type of data source or destination is supported by a specific Stream Processor Service function. For example, you must use the Splunk forwarders source function to receive data from a Splunk forwarder, and use the Amazon Kinesis Data Streams sink function to write data to a Kinesis stream. Functions that provide read access to data sources are called source functions, while functions that provide write access to data destinations are called sink functions.
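For example, after a pipeline writes processed records to a Kinesis stream through the Amazon Kinesis Data Streams sink function, any standard Kinesis consumer can read those records downstream. The following Python sketch illustrates that read path; it is not part of the Stream Processor Service, and it assumes the boto3 library, AWS credentials available in the environment, and a hypothetical stream named "processed-events".

    import json
    import boto3

    # Hypothetical stream that a Kinesis Data Streams sink function writes to.
    STREAM_NAME = "processed-events"

    kinesis = boto3.client("kinesis")

    # Read from the start of the first shard. Production consumers would
    # typically checkpoint their position or use the Kinesis Client Library.
    shard_id = kinesis.list_shards(StreamName=STREAM_NAME)["Shards"][0]["ShardId"]
    iterator = kinesis.get_shard_iterator(
        StreamName=STREAM_NAME,
        ShardId=shard_id,
        ShardIteratorType="TRIM_HORIZON",
    )["ShardIterator"]

    response = kinesis.get_records(ShardIterator=iterator, Limit=10)
    for record in response["Records"]:
        # Record payloads are bytes; this assumes the pipeline wrote JSON.
        print(json.loads(record["Data"]))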

The following diagram summarizes the data sources and destinations that the Stream Processor Service supports:

The Stream Processor Service can collect data from sources such as the Data Stream Firehose, the Forwarders service, the Ingest service, and Amazon Kinesis Data Streams. The Stream Processor Service can send data to destinations such as Splunk Cloud, Splunk Enterprise, Splunk Infrastructure Monitoring, Splunk APM, Amazon Kinesis Data Streams, and Amazon S3.

Data collection methods

Depending on the specific type of data source that you are working with, the Stream Processor Service uses one of the following services or connector types to collect data from it:

Ingest service: Collects JSON objects from the /events and /metrics endpoints of the Ingest service (see the example after this list).
Forwarders service: Collects data from Splunk forwarders.
Stream Processor Service HTTP Event Collector (Stream Processor Service HEC): Collects data from HTTP clients and syslog data sources.
Connectors: Collect data from several other types of data sources, including Amazon Kinesis Data Streams.
Data Stream Firehose: A combined data collection method that reads all data collected by the Forwarders service, the Ingest service, and Stream Processor Service HEC.
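As an example of the Ingest service collection method, the following Python sketch sends a JSON event to the /events endpoint over HTTP. The host name, full URL path, authentication token, and event fields shown are placeholders rather than values from this documentation; check your deployment for the actual endpoint and event schema. The sketch assumes the requests library.

    import json
    import requests

    # Placeholder endpoint and token; the actual host, path prefix, and
    # authentication scheme depend on your Stream Processor Service deployment.
    EVENTS_URL = "https://ingest.example.com/events"
    TOKEN = "<your-access-token>"

    # Hypothetical event shape; the /events endpoint accepts JSON objects.
    events = [
        {
            "body": "Hello from an HTTP client",
            "sourcetype": "httpevent",
        }
    ]

    response = requests.post(
        EVENTS_URL,
        headers={
            "Authorization": f"Bearer {TOKEN}",
            "Content-Type": "application/json",
        },
        data=json.dumps(events),
    )
    response.raise_for_status()
    print("Accepted:", response.status_code)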

Data delivery guarantees

The Stream Processor Service delivers data from a pipeline to a data destination on an at-least-once basis. If your data destination becomes unreachable, the Stream Processor Service stops sending data to it. Once the data destination becomes available again, the Stream Processor Service restarts the affected pipelines and resumes sending data where it left off. If your destination is unavailable for 24 hours or more, your data is dropped, because the Stream Processor Service retains received data for up to 24 hours only.
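Because delivery is at least once, a destination can receive the same event more than once, for example after an affected pipeline restarts and resends data. If duplicates matter to your downstream application, a common approach is to deduplicate on a stable event identifier. The following Python sketch shows that idea under the assumption that each event carries a hypothetical unique "id" field; it is illustrative only and not part of the Stream Processor Service.

    # At-least-once consumer sketch: process each event only once by tracking
    # the identifiers already seen. A real consumer would persist this state
    # (for example, in a database) so that it survives restarts.
    seen_ids = set()

    def process(event: dict) -> None:
        # Your application logic goes here.
        print("processing", event)

    def handle(event: dict) -> None:
        event_id = event.get("id")  # hypothetical unique identifier
        if event_id in seen_ids:
            return  # duplicate caused by redelivery; skip it
        seen_ids.add(event_id)
        process(event)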
