Create custom functions with the SDK
User plugins are custom functions that extend DSP to serve your specific needs. Although DSP ships with an extensive list of functions, it is possible that these existing functions do not meet your exact requirements.
You can use user plugins to extend DSP functionality in the following ways:
- Connect DSP to a custom data destination by writing your own sink function.
- Connect DSP to a custom data source by writing your own source function.
- Perform additional data enrichment on your streaming data by writing your own streaming and scalar functions.
This topic describes how to register, upload, and use the example user plugins. You can then use these examples as a reference for developing your own plugins.
- You must have Java 8.
- To develop a plugin for DSP, you must be familiar with the following concepts:
|Concept|Description|
|---|---|
|Streaming function|Streaming functions operate on a stream of records and are the functions that are visible in the UI. Records stream from one streaming function to the next, getting processed and transformed along the way. Data sources and destinations are also streaming functions, referred to as "source" and "sink" functions respectively.|
|Scalar function|Scalar functions operate in the context of the streaming functions they are called in. Unlike streaming functions, scalar functions are not full nodes in a pipeline. You can use scalar functions for tasks such as addition and subtraction, comparison operations, or conversions between data types.|
|Sink function|A special type of streaming function that represents your data destination. A sink function is the last function in a completed data pipeline.|
|Source function|A special type of streaming function that represents your data source. A source function is the first function in a completed data pipeline.|
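The relationship between these function types can be sketched with a rough plain-Java analogy. This is not the DSP API; `java.util.stream` is used here only to illustrate how records flow through streaming functions while scalar work happens per record:

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class PipelineAnalogy {
    // A streaming function transforms every record flowing through it;
    // the per-record lambda inside plays the role of a scalar function.
    static List<String> normalize(List<String> records) {
        return records.stream()
                .map(record -> record.toUpperCase()) // scalar: per-record work
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<String> source = Arrays.asList("error", "warn", "info"); // "source"
        List<String> transformed = normalize(source);                 // "streaming"
        transformed.forEach(System.out::println);                     // "sink"
    }
}
```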
Set up the example user plugins
The following functions are example custom functions that you can reference while creating your own functions.
|Function name|Function type|Description|
|---|---|---|
|MapExpand|Streaming|Generates a new record for each key-value pair in a given map.|
|VariableWriteLog|Sink, streaming|Sends data to a Flink taskmanager log.|
|JoinStrings|Scalar|Joins a list of strings, separating each string by a given delimiter.|
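The core behavior of two of these examples can be sketched in plain Java. This is the logic only, not the DSP SDK API; the method names and record representation (a map per record) are illustrative assumptions:

```java
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class ExampleFunctionLogic {
    // JoinStrings: joins a list of strings with a given delimiter.
    static String joinStrings(List<String> strings, String delimiter) {
        return String.join(delimiter, strings);
    }

    // MapExpand: emits one record (here modeled as a single-entry map)
    // per key-value pair of the input map.
    static List<Map<String, Object>> mapExpand(Map<String, Object> input) {
        return input.entrySet().stream()
                .map(e -> {
                    Map<String, Object> record = new LinkedHashMap<>();
                    record.put(e.getKey(), e.getValue());
                    return record;
                })
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        System.out.println(joinStrings(Arrays.asList("Hello", "World"), " | "));

        Map<String, Object> m = new LinkedHashMap<>();
        m.put("host", "server1");
        m.put("port", 8443);
        System.out.println(mapExpand(m)); // two records, one per key-value pair
    }
}
```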
Perform the following steps to build the example plugins.
- Clone the repository from GitHub: DSP Plugins SDK.
- Update the SDK configuration with the following fields:
|Field|Description|
|---|---|
|SCLOUD_TOKEN_FILE|Path to your SCloud access token. This file must contain only the access token with no additional metadata.|
|PLUGIN_UPLOAD_SERVICE_PROTOCOL|The protocol used to reach the API service, for example: https.|
|PLUGIN_UPLOAD_SERVICE_HOST|The hostname used to reach the API service, for example: https://10.130.33.112.|
|PLUGIN_UPLOAD_SERVICE_PORT|The port associated with your host, for example: 8443.|
|PLUGIN_UPLOAD_SERVICE_ENDPOINT|The DSP plugin endpoint.|
- Expand the plugin examples:
./gradlew expandTemplates -PSDK_FUNCTIONS_PATH=dsp-plugin-examples
- Build the plugin examples.
Register and upload the example user plugins
After building the plugin examples, perform these steps to register and upload the example plugins to DSP.
- Register the example plugin.
./gradlew registerPlugin -PSDK_PLUGIN_NAME="sdk-examples" -PSDK_PLUGIN_DESC="Template SDK example functions."
# Response:
# Registered plugin: sdk-examples. [pluginId:0613b04d-9269-4e4c-a240-9f415f5514ca, name:sdk-examples, description:Template SDK example functions., tenantId:default, lastUpdateDate:1566498405383, lastUpdateUserId:0, isDeleted:false]
- List the plugins that you have available.
- Upload the built JAR to DSP.
./gradlew uploadPlugin -PPLUGIN_ID=<id-from-registration-response> -PPLUGIN_MODULE=dsp-plugin-examples
- Refresh the UI.
You should see the two new streaming functions available for use in your pipeline in the UI. The third function, join_strings, is a scalar function and appears only in the context of a streaming function.
Use the example plugins
After registering and uploading the example plugins, you can use these custom functions in DSP.
- From the UI, select Pipelines.
- Select Splunk DSP Firehose as your source function.
- Click the + icon to add a new function, and select Eval. Enter the following expression:
HelloWorld=join_strings(("Hello", "World"), " | ")
- Validate your pipeline.
- Click Start Preview and send some events to your pipeline.
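If the plugin is working, the HelloWorld field produced by the Eval expression should contain the two strings joined by the delimiter. The equivalent joining logic in plain Java (illustrative only, not the SDK implementation):

```java
import java.util.Arrays;

public class JoinStringsPreview {
    // Equivalent of the Eval expression join_strings(("Hello", "World"), " | ")
    static String helloWorld() {
        return String.join(" | ", Arrays.asList("Hello", "World"));
    }

    public static void main(String[] args) {
        System.out.println(helloWorld()); // prints: Hello | World
    }
}
```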
User plugin templates
The SDK includes templates to help you get started writing your own plugins. The template that you use depends on the type of function that you want to create.
These templates are located in the SDK repository that you cloned earlier.
Update a plugin
After you upload a new version of a plugin JAR, any functions in the latest version of the plugin are immediately available for use in new pipelines. Existing active pipelines continue to run using the functions from the JAR version that was present when the pipeline was activated. You can deactivate and reactivate running pipelines to pick up the latest function changes.
If you upload a new version of a plugin JAR containing changes that are incompatible with the previous version (including function name and function signature changes), you need to edit any existing pipelines that use the changed functions before you can validate and activate them.
This documentation applies to the following versions of Splunk® Data Stream Processor: 1.2.0, 1.2.1-patch02, 1.2.1, 1.2.2-patch02, 1.2.4, 1.2.5, 1.3.0, 1.3.1