All DSP releases prior to DSP 1.4.0 use Gravity, a Kubernetes orchestrator that has been announced end-of-life. In DSP 1.4.0, Gravity has been replaced with an alternative component. As a result, support for DSP versions prior to 1.4.0 ends after July 1, 2023. We advise all customers to upgrade to DSP 1.4.0 to continue receiving full product support from Splunk.
Connecting syslog data sources to your DSP pipeline
When creating a data pipeline in the Splunk Data Stream Processor (DSP), you can get syslog data into your pipeline by routing the data to a Splunk Connect for Syslog (SC4S) instance and then connecting to that SC4S instance as a data source. You can get syslog data into a pipeline, transform the data as needed, and then send the transformed data out from the pipeline to a destination of your choosing.
See Splunk Connect for Syslog for more information about SC4S.
To connect to SC4S as a data source, you must complete the following tasks:
- Configure your SC4S instance to send syslog data to DSP. You'll need to generate a DSP HTTP Event Collector (HEC) token and configure your SC4S instance to use it. See Configure SC4S to send syslog data to DSP.
- Create a pipeline that starts with the Splunk DSP Firehose source function.
  - See the Building a pipeline chapter in the Use the Data Stream Processor manual for instructions on how to build a data pipeline.
  - See Get data from Splunk DSP Firehose in the Function Reference manual for more information about the source function.
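As a sketch of the first task, SC4S is typically pointed at an HEC endpoint through environment variables in its `env_file`. The hostname, port, and token values below are placeholders; substitute the DSP HEC endpoint for your deployment and the token you generated in DSP:

```bash
# /opt/sc4s/env_file -- illustrative values only
# URL of the DSP HEC endpoint that SC4S should send syslog data to
SC4S_DEST_SPLUNK_HEC_DEFAULT_URL=https://dsp.example.com:31000/services/collector
# The DSP HEC token generated for this SC4S instance
SC4S_DEST_SPLUNK_HEC_DEFAULT_TOKEN=<your-DSP-HEC-token>
# Disable TLS verification only if the endpoint uses a self-signed certificate
SC4S_DEST_SPLUNK_HEC_DEFAULT_TLS_VERIFY=no
```

After editing the file, restart the SC4S service so the new destination settings take effect. See Configure SC4S to send syslog data to DSP for the exact variable names and values required for your version.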
When you activate the pipeline, the source function starts collecting the syslog data that is being routed to the SC4S instance.
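For example, a minimal pipeline of this shape, written in SPL2, might look like the following sketch. The `where` filter and the destination placeholder are illustrative only; see Get data from Splunk DSP Firehose in the Function Reference manual for the exact function signatures:

```
| from splunk_firehose()
| where source_type="sc4s"
| into <your_destination_function>;
```

Here `splunk_firehose()` is the source function that receives the records routed from SC4S, and the final `into` clause sends the transformed records to whichever destination function you choose.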
This documentation applies to the following versions of Splunk® Data Stream Processor: 1.2.0, 1.2.1-patch02, 1.2.1, 1.2.2-patch02, 1.2.4, 1.2.5, 1.3.0, 1.3.1, 1.4.0, 1.4.1, 1.4.2, 1.4.3, 1.4.4, 1.4.5, 1.4.6