Splunk® Data Stream Processor

Use the Data Stream Processor

DSP 1.2.1 is impacted by the CVE-2021-44228 and CVE-2021-45046 security vulnerabilities from Apache Log4j. To fix these vulnerabilities, you must upgrade to DSP 1.2.4. See Upgrade the Splunk Data Stream Processor to 1.2.4 for upgrade instructions.

On October 30, 2022, all 1.2.x versions of the Splunk Data Stream Processor will reach their end of support date. See the Splunk Software Support Policy for details.
This documentation does not apply to the most recent version of Splunk® Data Stream Processor. For documentation on the most recent version, go to the latest release.

Use the Ingest service to send test events to your pipeline

You can use the Ingest service to send test events to your pipeline.
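
The SCloud commands in this topic are wrappers around the Ingest REST API. As a rough, hypothetical illustration only (the host, port, tenant, and API version shown here are assumptions that depend on your deployment), a raw request to the /events endpoint might look like this:

  # Hypothetical example; host, port, tenant, and API version vary by deployment.
  curl -k -X POST "https://<DSP_HOST>:31000/default/ingest/v1beta2/events" \
    -H "Authorization: Bearer <access_token>" \
    -H "Content-Type: application/json" \
    -d '[{"body": "This is a test event.", "sourcetype": "Unknown"}]'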

Send a test event to an Ingest service endpoint

If this is the first time you are using the Ingest service, use SCloud to verify that you can send an event to an Ingest service endpoint.

Prerequisites

  • The following steps assume that you have SCloud configured. See Authenticate with SCloud for instructions on how to configure SCloud.

Steps

  1. Log in to SCloud.
    ./scloud login

    SCloud doesn't return your login metadata or access token. If you want to see your access token, you must log in to SCloud using the verbose flag: ./scloud login --verbose. A sketch for capturing the returned token follows these steps.

  2. Type the following SCloud commands to test whether you can send events and metrics to an Ingest service endpoint.
    1. Type the following command to test sending data to the /events endpoint:
      ./scloud ingest post-events --format raw <<< 'This is a test event.'
    2. Type the following command to test sending data to the /metrics endpoint:
      echo '[{"name":"test", "value":1}]' | ./scloud ingest post-metrics

      To run the ingest post-metrics command, you must use SCloud 4.0.0 or higher.
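
If you need the raw access token for use outside of SCloud, you can capture it from the verbose login output. This is a minimal sketch that assumes the verbose output includes a JSON "access_token" field; the exact output format can vary between SCloud versions:

  # Assumes --verbose prints JSON containing an "access_token" field.
  ./scloud login --verbose | grep -o '"access_token": *"[^"]*"'

The post-events command also accepts the same metadata flags that are used later in this topic. For example, to tag a test event with an explicit host, source, and sourcetype (the values here are placeholders, not required settings):

  echo 'This is a test event with metadata.' | ./scloud ingest post-events --host testhost --source scloud --sourcetype syslog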

Send demo data to your pipeline

If you want to quickly send multiple test events into your pipeline, one option is to use the demodata.txt file located in the /examples folder in your working directory.

Prerequisites

  • A pipeline that uses a source function that can receive data from the Ingest service, such as the Splunk DSP Firehose or Ingest Service source function.

Steps

  1. From the Data Management page in the Data Stream Processor UI, select a pipeline with one of the prerequisite source functions.
  2. Click Start Preview.
  3. From your main working directory, navigate to the /examples directory.
    cd examples
  4. Move the demodata.txt file from /examples into your main working directory.
    mv demodata.txt .. 
  5. Navigate back to the main working directory.
    cd ..
  6. Log in to the SCloud CLI with ./scloud login --verbose. The password is the same one that you use to log in to the Data Stream Processor. If you need to re-print your username and password, run sudo ./print-login.

    Your access token and other metadata are returned. Your access token expires twelve hours after you log in, so log in periodically to refresh it.

  7. Use SCloud to send events from demodata.txt.
    1. Type the following command to send the entire contents of the file to your pipeline. This command can take up to a minute to finish.
      cat demodata.txt | while read -r line; do echo "$line" | ./scloud ingest post-events --host Buttercup --source syslog --sourcetype Unknown; done
    2. Type the following command to send a subset of the file to your pipeline.
      head -n <number of lines to read> demodata.txt | while read -r line; do echo "$line" | ./scloud ingest post-events --host Buttercup --source syslog --sourcetype Unknown; done
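
    For example, to send only the first 10 lines of the file (10 is an arbitrary sample size):
      head -n 10 demodata.txt | while read -r line; do echo "$line" | ./scloud ingest post-events --host Buttercup --source syslog --sourcetype Unknown; done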

This documentation applies to the following versions of Splunk® Data Stream Processor: 1.2.0, 1.2.1-patch02, 1.2.1, 1.2.2-patch02, 1.2.4, 1.2.5

