Splunk® Data Stream Processor

Use the Data Stream Processor



On April 3, 2023, Splunk Data Stream Processor reached its end of sale, and will reach its end of life on February 28, 2025. If you are an existing DSP customer, please reach out to your account team for more information.

All DSP releases prior to DSP 1.4.0 use Gravity, a Kubernetes orchestrator that has reached its announced end of life. We have replaced Gravity with an alternative component in DSP 1.4.0. Therefore, after July 1, 2023, we will no longer provide support for versions of DSP prior to DSP 1.4.0. We advise all of our customers to upgrade to DSP 1.4.0 to continue receiving full product support from Splunk.

Create custom functions with the SDK

User plugins are custom functions that extend DSP to serve your specific needs. Although DSP ships with an extensive set of functions, the existing functions might not meet your exact requirements.

You can use plugins to extend DSP functionality in the following ways:

  • Connect DSP to a custom data destination by writing your own sink function.
  • Connect DSP to a custom data source by writing your own source function.
  • Perform additional data enrichment on your streaming data by writing your own streaming and scalar functions.

This topic describes how to register, upload, and use the example user plugins. You can then use the examples as a reference for developing your own plugins.

Prerequisites

  • You must have Java 8.
  • To develop a plugin for the Data Stream Processor, you must be familiar with the following DSP concepts.
    Streaming function: A function that operates on a stream of records. Streaming functions are the functions visible in the UI. Records stream from one streaming function to the next and are processed and transformed along the way. Data sources and destinations are also streaming functions, and are referred to as "source" and "sink" functions respectively.
    Scalar function: A function that operates in the context of the streaming function it is called in. Unlike streaming functions, scalar functions are not full nodes in a pipeline. You can use scalar functions for tasks such as addition and subtraction, comparison operations, and conversions between data types.
    Sink function: A special type of streaming function that represents your data destination. A sink function is the last function in a completed data pipeline.
    Source function: A special type of streaming function that represents your data source. A source function is the first function in a completed data pipeline.
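
For example, a completed pipeline might be shaped like the following, with a scalar function evaluated inside a streaming function rather than appearing as its own pipeline node:

    Splunk DSP Firehose (source) -> Eval (streaming, calls the scalar function join_strings) -> sink function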

Set up the example user plugins

The following functions are example custom functions that you can reference while creating your own functions.

  • MapExpand (streaming function): Generates a new record for each key-value pair in a given map.
  • VariableWriteLog (sink and streaming function): Sends data to the Flink TaskManager log.
  • JoinStrings (scalar function): Joins a list of strings, separating each string with a given delimiter.
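
As an illustration of what a streaming function does, the following plain-Java sketch shows the core fan-out logic of a MapExpand-style function. All SDK wiring is omitted, and the class and field names are illustrative assumptions; real plugins implement the SDK's streaming-function interfaces (see the templates later in this topic).

    import java.util.ArrayList;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    // Illustrative sketch only: the fan-out logic of a MapExpand-style
    // function, without any SDK interfaces or registration code.
    public final class MapExpandSketch {
        // Turns one input map into one output record per key-value pair.
        // The "key" and "value" field names are assumptions for this sketch.
        public static List<Map<String, Object>> expand(Map<String, Object> input) {
            List<Map<String, Object>> records = new ArrayList<>();
            for (Map.Entry<String, Object> entry : input.entrySet()) {
                Map<String, Object> record = new HashMap<>();
                record.put("key", entry.getKey());
                record.put("value", entry.getValue());
                records.add(record);
            }
            return records;
        }
    }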

Setup

Perform the following steps to build the example plugins.

  1. Clone the repository from GitHub: DSP Plugins SDK.
  2. Update the gradle.properties file with the following fields. A filled-in example follows these steps.
    SCLOUD_TOKEN_FILE: The path to your SCloud access token. This file must contain only the access token, with no additional metadata.
    PLUGIN_UPLOAD_SERVICE_PROTOCOL: http or https.
    PLUGIN_UPLOAD_SERVICE_HOST: The hostname or IP address used to reach the API service, for example 10.130.33.112.
    PLUGIN_UPLOAD_SERVICE_PORT: The port associated with your host, for example 443.
    PLUGIN_UPLOAD_SERVICE_ENDPOINT: The DSP plugin endpoint, for example /streams/v3beta1/plugins.
  3. Expand the plugin examples to dsp-plugin-functions.
    ./gradlew expandTemplates -PSDK_FUNCTIONS_PATH=dsp-plugin-examples
  4. Build the plugin examples.
    ./gradlew dsp-plugin-examples:build
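
For reference, a filled-in gradle.properties might look like the following. The token path is a placeholder, and the host and port reuse the example values above; substitute the values for your own environment.

    SCLOUD_TOKEN_FILE=/home/<your-user>/.scloud_token
    PLUGIN_UPLOAD_SERVICE_PROTOCOL=https
    PLUGIN_UPLOAD_SERVICE_HOST=10.130.33.112
    PLUGIN_UPLOAD_SERVICE_PORT=443
    PLUGIN_UPLOAD_SERVICE_ENDPOINT=/streams/v3beta1/plugins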

Register and upload the example user plugins

After building the plugin examples, perform these steps to register and upload the example plugins to DSP.

  1. Register the example plugin.
    ./gradlew registerPlugin -PSDK_PLUGIN_NAME="sdk-examples" -PSDK_PLUGIN_DESC="Template SDK example functions."
    
    #Response:
    Registered plugin: sdk-examples. Response: [pluginId:0613b04d-9269-4e4c-a240-9f415f5514ca, name:sdk-examples, description:Template SDK example functions., tenantId:default, lastUpdateDate:1566498405383, lastUpdateUserId:0, isDeleted:false]
    
  2. List the plugins that you have available.
    ./gradlew getPlugins
  3. Upload the built JAR to the Data Stream Processor.
    ./gradlew uploadPlugin -PPLUGIN_ID=<id-from-registration-response> -PPLUGIN_MODULE=dsp-plugin-examples
    
  4. Refresh the UI.

You should see the two new streaming functions available for use in your pipelines in the UI. The third function, join_strings, is a scalar function and appears only in the context of a streaming function.

Use the example plugins

After registering and uploading the example plugins, you can use them as custom functions in DSP.

  1. From the UI, select Pipelines.
  2. Select Splunk DSP Firehose as your source function.
  3. Click the + icon to add a new function, and select Eval.
  4. Use join_strings in Eval:
    HelloWorld=join_strings(("Hello", "World"), " | ")
  5. Validate your pipeline.
  6. Click the Start Preview button and send some events to your pipeline.
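
If the function is working, you should see each record in the preview gain a HelloWorld field containing "Hello | World".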

User plugin templates

The SDK includes templates to help you get started writing your own plugins. The template that you use depends on the type of function that you want to create.

The following templates are located in the templates/src/main/java/.../functions directory.

  • TemplateScalarFunction.java
  • TemplateSinkFunction.java
  • TemplateStreamingFunction.java
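
As a companion to TemplateScalarFunction.java, the following plain-Java sketch shows the kind of logic a JoinStrings-style scalar function encapsulates. The SDK interfaces and registration boilerplate are omitted and the class name is an assumption; start from the template file for a working skeleton.

    import java.util.List;

    // Illustrative sketch only: the joining logic behind a
    // join_strings-style scalar function.
    public final class JoinStringsSketch {
        // Example: join(Arrays.asList("Hello", "World"), " | ")
        // returns "Hello | World".
        public static String join(List<String> values, String delimiter) {
            if (values == null || values.isEmpty()) {
                return "";
            }
            return String.join(delimiter, values);
        }
    }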

Update a plugin

After you upload a new version of a plugin JAR, any functions in the latest version of the plugin are immediately available for use in new pipelines. Existing active pipelines continue to run with the functions from the JAR version that was current when the pipeline was activated. To pick up the latest function changes, deactivate and then reactivate the pipeline.

If you upload a new version of a plugin JAR containing changes that are incompatible with the previous version, such as changed function names or signatures, you must edit any existing pipelines that use the changed functions before you can validate and activate them.
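
To ship an updated version of a plugin, rebuild the JAR and upload it again using the plugin ID from the original registration, for example:

    ./gradlew dsp-plugin-examples:build
    ./gradlew uploadPlugin -PPLUGIN_ID=<id-from-registration-response> -PPLUGIN_MODULE=dsp-plugin-examples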
