All DSP releases prior to DSP 1.4.0 use Gravity, a Kubernetes orchestrator that has been announced as end-of-life. We have replaced Gravity with an alternative component in DSP 1.4.0. Therefore, after July 1, 2023, we will no longer provide support for versions of DSP prior to DSP 1.4.0. We advise all of our customers to upgrade to DSP 1.4.0 in order to continue to receive full product support from Splunk.
Connecting multiple data sources to your DSP pipeline
When creating a data pipeline in the Splunk Data Stream Processor (DSP), you can choose to connect multiple data sources to the pipeline. For example, you can create a single pipeline that gets data from a Splunk forwarder, a syslog server, and Amazon Kinesis Data Streams concurrently. You can apply transformations to the data from all three data sources as the data passes through the pipeline, and then send the transformed data out from the pipeline to a destination of your choosing.
To connect a pipeline to multiple data sources, you can use either of the following methods:
| Method | Description |
|---|---|
| Start your data pipeline with the Splunk DSP Firehose source function. | The Splunk DSP Firehose source function collects data from a subset of the data sources that DSP supports, and outputs the combined data in a single stream. Use this method if the Splunk DSP Firehose source function supports your data sources, and if you don't need to apply any transformations to the data from each data source before combining the streams. See the rest of this page for more information about the Splunk DSP Firehose. |
| Configure a connection and a source function for each data source, and then use a union function to combine the data streams from these source functions into a single stream at the start of your pipeline. | Use this method if the Splunk DSP Firehose function does not support your data sources, or if you want to apply specific transformations to the data streams before combining them. See Get data from Splunk DSP Firehose in the Function Reference manual to confirm whether your data source is supported. See Union in the Function Reference manual and the Building a pipeline chapter in the Use the Data Stream Processor manual for more information about using the union function. |
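As a sketch of the union method, the following hypothetical SPL2 pipeline statement reads from two separately configured source functions, applies a source-specific transformation to one of them, and then merges the streams with the union function. The connection names ("my-kinesis-connection") and the exact source function signatures shown here are illustrative assumptions; check the Function Reference manual for the signatures that your DSP version supports.

```spl2
| union
    [| from forwarders("forwarders")],
    [| from kinesis("my-kinesis-connection", "my-stream")
       | eval body=deserialize_json_object(value)]
| eval merged=true
```

After the union function, the combined records flow through any shared transformations and then into a sink function of your choosing, just as they would in a single-source pipeline.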
Connecting to multiple data sources using the Splunk DSP Firehose
To connect to multiple data sources using the Splunk DSP Firehose, you must complete the following tasks:
- For each data source that you want to collect data from, create a connection. See Get data from Splunk DSP Firehose in the Function Reference manual to confirm that your data source is supported, and refer to the corresponding chapters in this Connect to Data Sources and Destinations with DSP manual for instructions on creating the connection.
- Create a pipeline that starts with the Splunk DSP Firehose source function.
  - See the Building a pipeline chapter in the Use the Data Stream Processor manual for instructions on how to build a data pipeline.
  - See Get data from Splunk DSP Firehose in the Function Reference manual for more information about the Splunk DSP Firehose function.
When you activate the pipeline, the Splunk DSP Firehose source function starts collecting data from all the supported data sources that have a valid DSP connection. Each event or metric event is received into the pipeline as a record.
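In SPL2 terms, a pipeline that starts with the Splunk DSP Firehose looks roughly like the following sketch. The splunk_firehose source function takes no arguments because it reads from every supported source with a valid connection; the transformation and the sink function arguments ("my-splunk-connection", "main") are illustrative assumptions rather than required values.

```spl2
| from splunk_firehose()
| eval processed_note="passed through DSP"
| into splunk_enterprise_indexes("my-splunk-connection", cast(map_get(attributes, "index"), "string"), "main");
```

Consult the Function Reference manual for the exact sink function and parameters appropriate to your destination.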
If any data fails to get into the pipeline, check the connection settings to make sure that you have the correct credentials.
This documentation applies to the following versions of Splunk® Data Stream Processor: 1.2.0, 1.2.1-patch02, 1.2.1, 1.2.2-patch02, 1.2.4, 1.2.5, 1.3.0, 1.3.1, 1.4.0, 1.4.1, 1.4.2, 1.4.3, 1.4.4, 1.4.5, 1.4.6