Splunk® Data Stream Processor

Connect to Data Sources and Destinations with DSP

On April 3, 2023, Splunk Data Stream Processor reached its end of sale, and will reach its end of life on February 28, 2025. If you are an existing DSP customer, please reach out to your account team for more information.

All DSP releases prior to DSP 1.4.0 use Gravity, a Kubernetes orchestrator that has been announced as end-of-life. We have replaced Gravity with an alternative component in DSP 1.4.0. Therefore, after July 1, 2023, we will no longer provide support for versions of DSP prior to DSP 1.4.0. We advise all of our customers to upgrade to DSP 1.4.0 to continue receiving full product support from Splunk.

Connecting Amazon Kinesis Data Streams to your DSP pipeline as a data source

When creating a data pipeline in the Splunk Data Stream Processor, you can connect to Amazon Kinesis Data Streams and use it as a data source. You can get data from a Kinesis data stream into a pipeline, transform the data as needed, and then send the transformed data out from the pipeline to a destination of your choosing.

To connect to Kinesis as a data source, you must complete the following tasks:

  1. Create a connection that allows DSP to access your Kinesis data. See Create a DSP connection to Amazon Kinesis Data Streams.
  2. Create a pipeline that starts with the Amazon Kinesis Data Stream source function. See the Building a pipeline chapter in the Use the Data Stream Processor manual for instructions on how to build a data pipeline.
  3. Configure the Amazon Kinesis Data Stream source function to use your Kinesis connection. See Get data from Amazon Kinesis Data Stream in the Function Reference manual.
  4. (Optional) To verify that you've configured the connection and source function correctly, start a pipeline preview and confirm that your Kinesis data appears in the Preview Results pane as expected. The payloads of your Kinesis records are stored in a field named value, which is a bytes field.
     Amazon Kinesis Data Streams always encodes data using Base64 before transporting it. DSP automatically decodes the incoming data from Kinesis, so you don't need to include pipeline functions for decoding the data.
  5. (Optional) Convert the value field from bytes to a more commonly supported data type, such as string. This conversion makes the field compatible with a wider range of streaming functions. To convert your data, start by adding an Eval function to your pipeline. Place this function immediately after the Amazon Kinesis Data Stream source function, or after the Where function if you're using one to filter the incoming data. Then, configure the Eval function to use the appropriate conversion scalar function.
     The specific scalar function that you need to use depends on the format of your Kinesis payload. In most cases, you can use one of the following expressions in your Eval function:
     • To convert the Kinesis payload from bytes to a string: value=tostring(value)
     • To convert the Kinesis payload from bytes to a map of key-value pairs: value=deserialize_json_object(value)

     See Eval and Conversion in the Function Reference manual for more information about these functions.

  6. If you're planning to send the Kinesis data to a Splunk index, make sure to format the records so that they can be indexed meaningfully. See Formatting data from Amazon Kinesis Data Streams for indexing in the Splunk platform.
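The decoding and conversion steps above can be illustrated with a short Python sketch. This is not DSP code: the example event is hypothetical, and the standard-library calls below are only conceptual stand-ins for the automatic Base64 decoding that DSP performs and for the tostring and deserialize_json_object scalar functions.

```python
import base64
import json

# A Kinesis record payload as it travels on the wire: Base64-encoded bytes.
# The underlying event here is a hypothetical JSON log line.
wire_payload = base64.b64encode(b'{"host": "web-01", "status": 200}')

# DSP decodes the Base64 layer automatically; the result lands in the
# pipeline record's value field as raw bytes.
value = base64.b64decode(wire_payload)

# Conceptual equivalent of value=tostring(value): bytes -> string.
value_as_string = value.decode("utf-8")

# Conceptual equivalent of value=deserialize_json_object(value):
# bytes -> map of key-value pairs (only valid for JSON object payloads).
value_as_map = json.loads(value)

print(value_as_string)        # {"host": "web-01", "status": 200}
print(value_as_map["status"]) # 200
```

If your Kinesis payloads are plain text rather than JSON, only the string conversion applies; deserialize_json_object fails on payloads that are not JSON objects.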

When you activate the pipeline, the Amazon Kinesis Data Stream source function starts collecting data from Kinesis.

Last modified on 25 March, 2022

This documentation applies to the following versions of Splunk® Data Stream Processor: 1.2.0, 1.2.1-patch02, 1.2.1, 1.2.2-patch02, 1.2.4, 1.2.5, 1.3.0, 1.3.1, 1.4.0, 1.4.1, 1.4.2, 1.4.3, 1.4.4, 1.4.5, 1.4.6

