Send data to Amazon Kinesis Data Streams
Send data to Amazon Kinesis Data Streams. Optionally, you can specify the Kinesis partition key for each record. If you do not provide a partition key, a hash of the payload determines the partition key. If you are using this function in an on-premises DSP environment, you need the DSP Universal license.
This is a connector-based function. To use it, you must first create a Kinesis connection. See Create a DSP connection to Amazon Kinesis Data Streams. Use that connection_id as an argument for this function.
Function input schema
Accepts records with any schema, but each record must have its body field serialized as bytes.
Required arguments

connection_id
- Syntax: string
- Description: The ID of the Kinesis connection that you must create before using this function.
- Example: "conx-2b39464e-0924"

stream_name
- Syntax: string
- Description: The name of the Kinesis stream that you want to write to.
- Example: "my-stream-123"

body
- Syntax: expression<bytes>
- Description: The JSON body that you want to write, serialized as bytes.
- Example: json-body

Optional arguments

partition_key
- Syntax: expression<string>
- Description: Your Kinesis partition key. See the AWS documentation about partition keys. Defaults to null, in which case a hash of the payload determines the partition key.
- Example: partition-key
You can write the function by providing the arguments in this exact order.
...| into kinesis("879837b0-cabf-4bc2-8589-fcc4dad753e7", "my-stream-123", to_bytes("body"));
Alternatively, you can use named arguments to declare the arguments in any order and leave out optional arguments you don't want to declare. All unprovided arguments use their default values. See SPL2 syntax for more details. The following example provides the arguments in an arbitrary order.
...| into kinesis(stream_name: "my-stream-123", connection_id: "879837b0-cabf-4bc2-8589-fcc4dad753e7", body: to_bytes("body"));
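Records that share a partition key are written to the same Kinesis shard, so you can set partition_key explicitly when related records must stay together. The following sketch assumes your records contain a hypothetical string field named device_id to use as the partition key; substitute any string-valued expression from your own data.

...| into kinesis(connection_id: "879837b0-cabf-4bc2-8589-fcc4dad753e7", stream_name: "my-stream-123", body: to_bytes("body"), partition_key: device_id);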
This documentation applies to the following versions of Splunk® Data Stream Processor: 1.2.0