Splunk® Data Stream Processor

Getting Data In




Send Syslog events to a DSP data pipeline using SC4S with DSP HEC

You can ingest syslog data into DSP using Splunk Connect for Syslog (SC4S) and the DSP HTTP Event Collector (DSP HEC). See SC4S Configuration Variables for more information about configuring SC4S.

Make sure that SC4S disk buffering is configured correctly to minimize event loss if the connection to DSP HEC is temporarily unavailable. See Data Resilience - Local Disk Buffer Configuration and SC4S Disk Buffer Configuration for more information on SC4S disk buffering.
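As a rough sketch, SC4S disk buffering is typically controlled through environment variables in the SC4S env_file. The variable names and values below are assumptions based on common SC4S deployments and can differ between SC4S versions, so confirm them against the SC4S disk buffer documentation linked above before using them.

```shell
# /opt/sc4s/env_file -- illustrative disk buffer settings.
# Variable names are assumptions; verify against your SC4S version's docs.
SC4S_DEST_SPLUNK_HEC_DISKBUFF_ENABLE=yes          # enable local disk buffering
SC4S_DEST_SPLUNK_HEC_DISKBUFF_RELIABLE=no         # normal (memory-backed) buffering mode
SC4S_DEST_SPLUNK_HEC_DISKBUFF_MEMBUFSIZE=10241024 # bytes buffered in memory
SC4S_DEST_SPLUNK_HEC_DISKBUFF_DISKBUFSIZE=53687091200 # max bytes buffered on disk
```

Larger disk buffer sizes let SC4S ride out longer DSP HEC outages at the cost of local disk space.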

Update your SC4S workflow to send syslog data to a DSP pipeline

  1. Create a new pipeline in DSP. See Create a pipeline using the Canvas Builder for more information on creating a new pipeline.
  2. Add Read from Splunk Firehose as your data source.
  3. Add transforms and data destinations as needed in your DSP pipeline.
  4. Use the DSP Ingest API to create a DSP HEC token.
    • You must use sc4s for the <hec-token-name> when you create the DSP HEC token.
    • Make sure you copy the <dsphec-token> after it is created. You will need it for the next step.
  5. Change the following variables in your SC4S configuration:
    • Set SPLUNK_HEC_URL to https://<DSP_HOST>:31000.
    • Set SPLUNK_HEC_TOKEN to <dsphec-token>.
  6. Restart your SC4S workflow.
  7. After you ingest syslog events into DSP, you can use DSP functions to format, filter, aggregate, or otherwise transform the data, and route it to different destinations. See Sending data from DSP to the Splunk platform and Sending data to other destinations for more information.
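Steps 4 through 6 above might look like the following from a shell. The Ingest API endpoint path and authorization header shown for token creation are illustrative assumptions, so consult the DSP Ingest API documentation for the exact request; SPLUNK_HEC_URL and SPLUNK_HEC_TOKEN are the SC4S variables named in step 5.

```shell
# Step 4: create a DSP HEC token named "sc4s" via the DSP Ingest API.
# The endpoint path and auth header below are assumptions -- check the
# DSP Ingest API reference for the exact request for your DSP version.
curl -k -X POST "https://<DSP_HOST>/default/ingest/v1beta2/collector/tokens/sc4s" \
    -H "Authorization: Bearer <your-dsp-access-token>"
# Copy the <dsphec-token> value from the response.

# Step 5: point SC4S at DSP HEC by setting these variables in the SC4S env_file:
#   SPLUNK_HEC_URL=https://<DSP_HOST>:31000
#   SPLUNK_HEC_TOKEN=<dsphec-token>

# Step 6: restart the SC4S service (systemd-based deployment assumed).
sudo systemctl restart sc4s
```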

Verify that your syslog data is available in your DSP pipeline

You can quickly verify that your syslog data is now available in DSP, either by clicking Start Preview on your current pipeline or by creating a new pipeline as follows.

  1. Create a new pipeline in DSP. See Create a pipeline using the Canvas Builder for more information on creating a new pipeline.
  2. Add Read from Splunk Firehose as your data source.
  3. (Optional) End your pipeline with the Write to the Splunk platform with Batching sink function.
  4. Click Start Preview.

You can now see your syslog data in the pipeline preview. See Navigating the Data Stream Processor for more information on using DSP to transform and manage your data.
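To confirm connectivity independently of SC4S, you can also send a single test event directly to DSP HEC with curl. This sketch assumes that DSP HEC accepts Splunk HEC-formatted requests on the port configured earlier and that the standard /services/collector endpoint path applies; verify both against the DSP HEC documentation.

```shell
# Send one test event to DSP HEC; it should appear in the pipeline preview.
# Endpoint path and event format are assumptions based on standard Splunk HEC.
curl -k "https://<DSP_HOST>:31000/services/collector" \
    -H "Authorization: Splunk <dsphec-token>" \
    -d '{"event": "sc4s connectivity test", "sourcetype": "syslog"}'
```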

This screen image shows a pipeline preview with syslog data.

Last modified on 06 July, 2020

This documentation applies to the following versions of Splunk® Data Stream Processor: 1.1.0
