Splunk® Data Stream Processor

Function Reference

On April 3, 2023, Splunk Data Stream Processor reached its end of sale, and will reach its end of life on February 28, 2025. If you are an existing DSP customer, please reach out to your account team for more information.

All DSP releases prior to DSP 1.4.0 use Gravity, a Kubernetes orchestrator that has reached its end of life. In DSP 1.4.0, Gravity has been replaced with an alternative component. Therefore, after July 1, 2023, we will no longer provide support for versions of DSP prior to DSP 1.4.0. We advise all of our customers to upgrade to DSP 1.4.0 to continue to receive full product support from Splunk.

Get data from Amazon Kinesis Data Stream

Use the Amazon Kinesis Data Stream source function to get data from Amazon Kinesis Data Streams.

The payload of the ingested data is Base64-encoded. To deserialize and preview your data, see Deserialize data from Amazon Kinesis Data Streams in the Connect to Data Sources and Destinations with the Data Stream Processor manual.
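For example, if the payloads in your stream contain JSON, a pipeline similar to the following sketch deserializes the value field into a map after ingestion. The connection ID and stream name shown here are placeholders, and this sketch assumes the deserialize_json_object scalar function is available in your DSP version; see the linked topic for the full procedure.

| from kinesis("my-connection-id", "my-stream-name") | eval value=deserialize_json_object(value) |...;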

Prerequisites

Before you can use this function, you must create a connection. See Create a connection to Amazon Kinesis Data Streams in the Connect to Data Sources and Destinations with the Data Stream Processor manual. When configuring this source function, set the connection_id argument to the ID of that connection.

Function output schema

This function outputs records with the schema described in the following table.

Key | Description
key | The partition key of the record as a string.
value | The payload of the record in bytes.
stream | The name of the Amazon Kinesis data stream that the record is coming from, given as a string.
shard | The ID of the shard in the Amazon Kinesis data stream that is associated with the record, given as a string.
sequence | The sequence number of the record as a string.
approxArrivalTimestamp | The date and time when the record entered the Amazon Kinesis data stream, given in epoch time format in milliseconds and stored as a long.
accountId | The ID of the Amazon Web Services (AWS) account associated with the record, given as a string.
region | The AWS region associated with the record, given as a string.

The following is an example of a typical record from the kinesis function:

{
"key": "837nyj2575uz04a21km379v744zfn232",
"value": "aGVsbG8gd29ybGQ=",
"stream": "my-records",
"shard": "shardId-000000000001",
"sequence": "405926725837525282491387629233518607908229414491194259",
"approxArrivalTimestamp": 1564424209658,
"accountId": "202997163303",
"region": "us-east-2"
}
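
You can reference these output fields directly in downstream functions. For example, the following sketch keeps only the records read from a specific shard; the connection ID, stream name, and shard ID shown here are placeholders.

| from kinesis("my-connection-id", "my-records") | where shard="shardId-000000000001" |...;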

Required arguments

connection_id
Syntax: string
Description: The ID of your Amazon Kinesis connection.
Example: "576205b3-f6f5-4ab7-8ffc-a4089a95d0c4"
stream_name
Syntax: string
Description: The name of the stream.
Example: "my-stream-name"

Optional arguments

initial_position
Syntax: LATEST | TRIM_HORIZON
Description: The position in the data stream where you want to start reading data. Defaults to LATEST.
  • LATEST: Start reading data from the latest position on the data stream.
  • TRIM_HORIZON: Start reading data from the very beginning of the data stream.
Example: LATEST

SPL2 example

When working in the SPL View, you can write the function by providing the arguments in this exact order.

| from kinesis("my-connection-id", "my-stream-name", "TRIM_HORIZON") |...;

Alternatively, you can use named arguments to declare the arguments in any order. The following example uses named arguments to list the optional argument before the required arguments.

| from kinesis(initial_position: "TRIM_HORIZON", connection_id: "my-connection-id", stream_name: "my-stream-name") |...;

If you want to use a mix of unnamed and named arguments in your functions, you need to list all unnamed arguments in the correct order before providing the named arguments.
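
For example, the following variation of the previous pipeline passes connection_id as an unnamed argument first, then provides the remaining arguments by name.

| from kinesis("my-connection-id", stream_name: "my-stream-name", initial_position: "TRIM_HORIZON") |...;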

This documentation applies to the following versions of Splunk® Data Stream Processor: 1.2.0, 1.2.1-patch02, 1.2.1, 1.2.2-patch02, 1.2.4, 1.2.5, 1.3.0, 1.3.1, 1.4.0, 1.4.1, 1.4.2, 1.4.3, 1.4.4, 1.4.5, 1.4.6

