All DSP releases prior to DSP 1.4.0 use Gravity, a Kubernetes orchestrator for which end-of-life has been announced. We have replaced Gravity with an alternative component in DSP 1.4.0. Therefore, after July 1, 2023, we will no longer provide support for versions of DSP prior to DSP 1.4.0. We advise all of our customers to upgrade to DSP 1.4.0 in order to continue to receive full product support from Splunk.
Use the Ingest service to send test events to your pipeline
You can use the Ingest service to send test events to your pipeline.
Send a test event to an Ingest service endpoint
If this is the first time you are using the Ingest service, you can use SCloud to verify that you can send an event to an Ingest service endpoint.
Prerequisites
- The following steps assume that you have SCloud configured. See Authenticate with SCloud for instructions on how to configure SCloud.
Steps
- Log in to SCloud.
./scloud login
By default, SCloud doesn't return your login metadata or access token. If you want to see your access token, log in to SCloud using the --verbose flag:
./scloud login --verbose
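If you want to keep a copy of the returned metadata for later reference, you can capture the verbose output in a file while still seeing it on screen. This is a minimal sketch, not part of the standard procedure; the file name is arbitrary and the exact fields in the output depend on your SCloud version:
./scloud login --verbose | tee scloud-login-output.txt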
- Type the following SCloud commands to test whether you can send events and metrics to an ingest endpoint.
- Type the following command to test sending data to the /events endpoint:
./scloud ingest post-events --format raw <<< 'This is a test event.'
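You can also attach metadata to a test event. The following sketch reuses the --host, --source, and --sourcetype flags that appear in the demo data commands later in this topic; substitute values that make sense for your environment:
echo 'This is a test event with metadata.' | ./scloud ingest post-events --format raw --host Buttercup --source syslog --sourcetype Unknown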
- Type the following command to test sending data to the /metrics endpoint:
echo '[{"name":"test", "value":1}]' | ./scloud ingest post-metrics
To run the ingest post-metrics command, you must use SCloud 4.0.0 or higher.
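Because the metrics payload is a JSON array, you can send several metrics in one call by adding more objects to the array. For example, following the same shape as the command above (the metric names here are illustrative):
echo '[{"name":"cpu.util", "value":45.5}, {"name":"mem.used", "value":2048}]' | ./scloud ingest post-metrics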
Send demo data to your pipeline
If you want to quickly send multiple test events into your pipeline, one option is to use the demodata.txt file located in the /examples folder in your working directory.
Prerequisites
- You can send events and metrics to an Ingest service endpoint.
- You have a pipeline with the Ingest Service or Splunk DSP Firehose as the source function.
Steps
- In the Data Stream Processor, click Pipelines and select a pipeline with one of the prerequisite source functions.
- Click Start Preview.
- From your main working directory, navigate to the /examples directory.
cd examples
- Move the demodata.txt file from /examples into your main working directory.
mv demodata.txt ..
- Navigate back to the main working directory.
cd ..
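As an alternative to the previous three steps, you can copy the file in a single command from your main working directory. This leaves the original in place in /examples:
cp examples/demodata.txt .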
- Log in to the SCloud CLI.
./scloud login --verbose
The password is the same one that you use to log in to the Data Stream Processor. If you need to recover your credentials, type sudo ./print-login to re-print your username and password. Your access token and other metadata are returned. Your access token expires in twelve hours, so log in periodically to refresh it.
- Use SCloud to send events from demodata.txt.
- Type the following command to send the entire contents of the file to your pipeline. This can take up to a minute to run completely.
cat demodata.txt | while IFS= read -r line; do echo "$line" | ./scloud ingest post-events --host Buttercup --source syslog --sourcetype Unknown; done
- Type the following command to send a subset of the file to your pipeline.
head -n <number of lines to read> demodata.txt | while IFS= read -r line; do echo "$line" | ./scloud ingest post-events --host Buttercup --source syslog --sourcetype Unknown; done
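For example, to send only the first 10 lines of demodata.txt (10 is an arbitrary sample size):
head -n 10 demodata.txt | while IFS= read -r line; do echo "$line" | ./scloud ingest post-events --host Buttercup --source syslog --sourcetype Unknown; done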