Splunk® Data Stream Processor

Use the Data Stream Processor

Splunk Data Stream Processor will reach its end of sale on April 3, 2023, and its end of life on February 28, 2025. If you are an existing DSP customer, reach out to your account team for more information.
This documentation does not apply to the most recent version of Splunk® Data Stream Processor. For documentation on the most recent version, go to the latest release.

Create custom functions with the DSP SDK

User plugins are JAR files containing one or more custom DSP functions that extend DSP to serve your specific needs. Although the Data Stream Processor ships with an extensive list of functions, the existing functions might not meet your exact requirements. You can write your own custom functions, register them as a plugin, and upload the plugin JAR to the Data Stream Processor.

This topic describes how to register, upload, and use the example user plugins using Gradle.

Prerequisites:

  • Java 8

User plugin examples

The following custom functions are examples that you can reference while creating your own functions.

Function name     Function type    Description
MapExpand         Streaming        Generates a new record for each key-value pair in a given map.
VariableWriteLog  Sink, streaming  Sends data to a Flink TaskManager log.
JoinStrings       Scalar           Joins a list of strings, separating each string by a given delimiter.
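
To make the function types concrete, here is an illustrative input and output for MapExpand. The record shapes and field names here are hypothetical, not taken from the SDK:

    # One input record containing a map:
    {"attributes": {"host": "web-01", "status": "200"}}

    # MapExpand generates one output record per key-value pair:
    {"key": "host", "value": "web-01"}
    {"key": "status", "value": "200"}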

Setup

Perform the following steps to build the example plugins.

  1. Clone the repository from GitHub: DSP Plugins SDK. Check out the git tag that corresponds to the version of DSP that you are using, for example, release-1.1.
  2. Update the gradle.properties file. A filled-in example follows these steps.
    Field                           Description
    SCLOUD_TOKEN_FILE               The path to your SCloud access token. This file must contain only the access token, with no additional metadata.
    PLUGIN_UPLOAD_SERVICE_PROTOCOL  The protocol to use: http or https.
    PLUGIN_UPLOAD_SERVICE_HOST      The hostname or IP address used to reach the Data Stream Processor API service, for example 10.130.33.112.
    PLUGIN_UPLOAD_SERVICE_PORT      The port associated with your host, for example 31000.
  3. Build the plugin examples.
    ./gradlew dsp-plugin-examples:build
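
For reference, a filled-in gradle.properties might look like the following. The token file path is hypothetical, and the host and port values echo the examples in the table above:

    SCLOUD_TOKEN_FILE=/home/user/.scloud-token
    PLUGIN_UPLOAD_SERVICE_PROTOCOL=https
    PLUGIN_UPLOAD_SERVICE_HOST=10.130.33.112
    PLUGIN_UPLOAD_SERVICE_PORT=31000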

Register and upload the example user plugins

After building the plugin examples, perform these steps to register and upload the example plugins to DSP.

  1. Register the example plugin.
    ./gradlew registerPlugin -PSDK_PLUGIN_NAME="sdk-examples" -PSDK_PLUGIN_DESC="Template SDK example functions."
    
    #Response:
    Registered plugin: sdk-examples. Response: [pluginId:0613b04d-9269-4e4c-a240-9f415f5514ca, name:sdk-examples, description:Template SDK example functions., tenantId:default, lastUpdateDate:1566498405383, lastUpdateUserId:0, isDeleted:false]
    
  2. List the plugins that you have available.
    ./gradlew getPlugins
  3. Upload the built JAR to the Data Stream Processor.
    ./gradlew uploadPlugin -PPLUGIN_ID=<id-from-registration-response> -PPLUGIN_MODULE=dsp-plugin-examples
    
  4. Refresh the Data Stream Processor UI.
  5. (Optional) Delete a plugin when you are done using it.
    ./gradlew deletePlugin -PPLUGIN_ID=<id>

You should see the two new streaming functions available for use in your pipeline in the Data Stream Processor UI. The third function, join_strings, is a scalar function and will appear only in the context of a streaming function.

Use the example plugins

After registering and uploading the example plugins, you can use the custom functions in DSP.

  1. From the Data Stream Processor UI, click Build Pipeline.
  2. Select Read from Splunk Firehose as your source function.
  3. Click the + icon to add a new function, and select Eval.
  4. Use join_strings in the Eval function:
    HelloWorld=join_strings(["Hello", "World"], " | ")
  5. Validate your pipeline.
  6. Click Start Preview and send some events to your pipeline.
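
If the pipeline is wired up correctly, each record in the preview includes a new HelloWorld field containing the string Hello | World, that is, the two input strings joined by the given delimiter.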

Write your own plugin

The code for a plugin lives in a Gradle module, which is a top-level directory in the SDK. dsp-plugin-examples is an example Gradle module that you can refer to throughout development.

dsp-plugin-functions is a skeleton Gradle module provided for you. This tutorial assumes that you are working in dsp-plugin-functions.

  1. Create a Java class inside the com.splunk.streaming.user.functions package. See the skeleton after this list.
    • Your Java class must implement one of the ScalarFunction, StreamingFunction, or SinkFunction interfaces, depending on which type of function you want to create.
  2. Add a file to the /resources/META-INF/services directory. The file's name must be the fully qualified name of the interface that your function implements, for example com.splunk.streaming.flink.streams.core.StreamingFunction.
  3. In this file, list the fully qualified class name of each plugin function that implements the given interface, one class per line, for example com.splunk.streaming.user.functions.MapExpandFunction. See the sample file after this list.
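
As a sketch of step 1, the following skeleton shows the shape of a custom scalar function class. The class name is hypothetical, and the exact methods that ScalarFunction requires are defined by the SDK, so copy the real signatures from the JoinStrings example in dsp-plugin-examples:

    package com.splunk.streaming.user.functions;

    // Hypothetical skeleton. The import path for ScalarFunction is assumed to
    // parallel the StreamingFunction interface named in step 2; confirm it
    // against the SDK and the JoinStringsFunction example.
    import com.splunk.streaming.flink.streams.core.ScalarFunction;

    public class ReverseStringFunction implements ScalarFunction {
        // Implement the methods that the ScalarFunction interface requires
        // (the function's name, its signature, and its evaluation logic),
        // mirroring JoinStringsFunction in dsp-plugin-examples.
    }

Steps 2 and 3 together produce a plain-text provider-configuration file that is named after the interface and lists one implementing class per line. For the MapExpand example, that file looks like this:

    # resources/META-INF/services/com.splunk.streaming.flink.streams.core.StreamingFunction
    com.splunk.streaming.user.functions.MapExpandFunction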

Follow the instructions in the section above to build your plugin JAR, register it with DSP, and upload it.

Uploading a new plugin version

After you upload a new version of a plugin JAR, the functions in the new version are immediately available for use in newly created pipelines and preview sessions. Existing running pipelines continue to run using the functions from the JAR version that was present when the pipeline was activated. To pick up the latest function changes, deactivate and reactivate the pipeline.

If you upload a new version of a plugin JAR that is not backwards compatible with the previous version (including function name and function signature changes), you might need to edit any existing pipelines that use the changed functions before you can validate and activate them. For example, renaming join_strings or changing its parameters would require updating any Eval expressions that call it.

Last modified on 28 September, 2020

This documentation applies to the following versions of Splunk® Data Stream Processor: 1.1.0

