All DSP releases prior to DSP 1.4.0 use Gravity, a Kubernetes orchestrator, which has been announced end-of-life. We have replaced Gravity with an alternative component in DSP 1.4.0. Therefore, we will no longer provide support for versions of DSP prior to DSP 1.4.0 after July 1, 2023. We advise all of our customers to upgrade to DSP 1.4.0 in order to continue to receive full product support from Splunk.
All Data Stream Processor (DSP) pipelines can be represented in JSON. The JSON representation of a pipeline is also known as Streams JSON. You can back up a pipeline by saving the JSON associated with the pipeline. Currently, there is no pipeline versioning mechanism, so you might find it helpful to manually back up your pipelines and save them in a version control system in case something unexpected happens.
Create a backup of a pipeline
Follow these steps to create a backup of a pipeline. This is useful in case you want to share a pipeline with someone outside of your organization or you want to save an older version of a pipeline for troubleshooting purposes.
Export a backup of your pipeline, either for storage outside of your DSP environment or to share with someone outside of your organization:
- From the home page, select Pipelines and find the pipeline that you want to save a copy of.
- Click the pipeline to view it in the Canvas View. If the pipeline is active, click Edit before continuing.
- Click the options button, and then click Export pipeline.
- Save the downloaded JSON file to your preferred location for storing backups.
Create a backup of your pipeline by cloning it:
- From the home page, click Pipelines and find the pipeline that you want to save a copy of.
- Click the options button in the pipeline row, and click Clone.
- Assign a name to the cloned pipeline and click Save. Make any modifications that you want to the cloned pipeline, and maintain the original pipeline as a precautionary measure.
You now have a backup of a pipeline that you can share with someone outside of your organization or keep in case something goes wrong with your pipeline.
Restore a pipeline
You can restore any exported pipeline by importing it.
- From the Pipelines page, select a Source to enter the Canvas View.
- From the Canvas View, click the options button, and then click Import pipeline.
- Browse to and select the JSON file that you previously exported. Click Import.
- Give your pipeline a name by clicking the pencil button. You can also give your pipeline a description by clicking the pipeline options button and then selecting Update Pipeline Metadata.
- Click Save.
Backup and restore a pipeline using the Splunk Cloud Services CLI
You can back up and restore a pipeline using the Splunk Cloud Services CLI. Use the CLI to export the Streams JSON of the pipeline, and then recreate the pipeline based on the exported Streams JSON. The Streams JSON is an abstract representation of a data pipeline using JSON syntax.
The following steps assume that you have jq installed.
- From the command line, log in to the Splunk Cloud Services CLI.
./scloud login
- To back up a pipeline by exporting the corresponding Streams JSON, do the following:
- List the pipelines that are currently in your tenant.
./scloud streams list-pipelines
- Find the pipeline that you want to back up, and copy its id.
- Export the Streams JSON of the pipeline. The following command saves the underlying Streams JSON associated with your pipeline to a designated JSON file.
./scloud streams get-pipeline --id <ID> | jq .data > <pipeline-name>.json
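If your tenant has many pipelines, you can also use jq to filter the list-pipelines output down to the id of a specific pipeline by name. The snippet below is a sketch: the sample file stands in for the command output, and the items, name, and id field names are assumptions about the response shape rather than a documented schema.

```shell
# Sample response standing in for `./scloud streams list-pipelines`.
# The items/name/id field names are hypothetical.
printf '%s' '{"items":[{"id":"abc-123","name":"http-to-index"},{"id":"def-456","name":"syslog"}]}' > pipelines.json

# Extract the id of the pipeline named "syslog".
jq -r '.items[] | select(.name == "syslog") | .id' pipelines.json
# → def-456
```

You can then pass the extracted id directly to the get-pipeline command shown above.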
- To restore a pipeline by recreating it based on the Streams JSON, run the following command.
./scloud streams create-pipeline --name "<pipeline-name>" --input-datafile <path-to-json-file> --bypass-validation true
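Before restoring, it can be worth confirming that the exported backup file is still valid JSON, especially if it has been stored or transferred outside of DSP. This is a hedged sketch using jq; the placeholder file and its keys are illustrative only, not the actual Streams JSON schema.

```shell
# Create a placeholder export file for illustration. Real files come from the
# get-pipeline command above; the nodes/edges keys here are hypothetical.
printf '%s' '{"nodes": [], "edges": []}' > my-pipeline.json

# `jq empty` parses the file and exits nonzero if the JSON is malformed.
jq empty my-pipeline.json && echo "valid JSON"

# List the top-level keys as a quick structural spot check (sorted by jq).
jq -r 'keys[]' my-pipeline.json
# → edges
# → nodes
```

If `jq empty` fails, re-export the pipeline rather than attempting to restore from the corrupted file.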
This documentation applies to the following versions of Splunk® Data Stream Processor: 1.3.0, 1.3.1, 1.4.0, 1.4.1, 1.4.2, 1.4.3, 1.4.4, 1.4.5, 1.4.6