All DSP releases prior to DSP 1.4.0 use Gravity, a Kubernetes orchestrator, which has been announced as end-of-life. We have replaced Gravity with an alternative component in DSP 1.4.0. Therefore, we will no longer provide support for versions of DSP prior to DSP 1.4.0 after July 1, 2023. We advise all of our customers to upgrade to DSP 1.4.0 in order to continue to receive full product support from Splunk.
Create a DSP connection to Amazon Kinesis Data Streams
To get data from Amazon Kinesis Data Streams into a data pipeline in the Splunk Data Stream Processor, you must first create a connection. You can then use the connection in the Amazon Kinesis Data Stream source function to get data from a Kinesis data stream into a DSP pipeline. If you have a Universal license, you can also create a connection for the Send to Amazon Kinesis Data Streams sink function to send data from DSP to a Kinesis data stream. See Licensing for the Splunk Data Stream Processor.
Prerequisites
Before you can create the Kinesis connection, you must have the following:
- An Identity and Access Management (IAM) user with the necessary permissions for reading from or writing to the Kinesis stream. See the AWS permissions for getting data from Amazon Kinesis Data Streams and AWS permissions for sending data to Amazon Kinesis Data Streams sections on this page for more information.
- The access key ID and secret access key for that IAM user.
If you don't have an IAM user with the necessary permissions, ask your Amazon Web Services (AWS) administrator for assistance.
AWS permissions for getting data from Amazon Kinesis Data Streams
Make sure that the IAM user that you are using to authenticate the connection has read permissions for the Kinesis stream. See the following list of permissions:
{ "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Action": [ "kinesis:ListShards", "kinesis:GetRecords", "kinesis:GetShardIterator", "kinesis:DescribeStream" ], "Resource": "*" } ] }
As a best practice for making sure that these permissions are not applied unnecessarily to other Kinesis streams, in the Resource element, specify the Amazon Resource Name (ARN) of the Kinesis stream that you want to read from. For example, the following Resource definition ensures that your specified permissions are applied only to the Kinesis stream named KinesisStreamA:

"Resource": [
    "arn:aws:kinesis:*:123123123123:stream/KinesisStreamA"
]
Search for "Controlling Access to Amazon Kinesis Data Streams Resources Using IAM" in the Amazon Kinesis Data Streams Developer Guide for more information about defining permissions for accessing Amazon Kinesis.
AWS permissions for sending data to Amazon Kinesis Data Streams
Make sure that the IAM user that you are using to authenticate the connection has write permissions for the Kinesis stream. See the following list of permissions:
{ "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Action": [ "kinesis:PutRecord", "kinesis:PutRecords", "kinesis:DescribeStream" ], "Resource": "*" } ] }
As a best practice for making sure that these permissions are not applied unnecessarily to other Kinesis streams, in the Resource element, specify the Amazon Resource Name (ARN) of the Kinesis stream that you want to write to. For example, the following Resource definition ensures that your specified permissions are applied only to the Kinesis stream named KinesisStreamB:

"Resource": [
    "arn:aws:kinesis:*:123123123123:stream/KinesisStreamB"
]
Search for "Controlling Access to Amazon Kinesis Data Streams Resources Using IAM" in the Amazon Kinesis Data Streams Developer Guide for more information about defining permissions for accessing Amazon Kinesis.
Steps
- In DSP, select the Connections page.
- On the Connections page, click Create Connection.
- Depending on whether you're using Amazon Kinesis Data Streams as a data source or data destination, do one of the following:
- On the Source tab, select Connector for Amazon Kinesis Data Streams Source and then click Next.
- On the Sink tab, select Connector for Amazon Kinesis Data Streams Sink and then click Next.
- Complete the following fields:
Field | Description
Name | A unique name for your connection.
Description | (Optional) A description of your connection.
AWS Access Key ID | The access key ID for your IAM user.
AWS Secret Access Key | The secret access key for your IAM user.
AWS Region | The AWS region of the Kinesis stream.

Any credentials that you upload are transmitted securely by HTTPS, encrypted, and securely stored in a secrets manager.
- Click Save.
If you're editing a connection that's being used by an active pipeline, you must reactivate that pipeline after making your changes. When you reactivate a pipeline, you must select where you want to resume data ingestion. See Using activation checkpoints to activate your pipeline in the Use the Data Stream Processor manual for more information.
You can now use your connection in an Amazon Kinesis Data Stream source function at the start of your data pipeline to get data from Kinesis, or in a Send to Amazon Kinesis Data Streams sink function at the end of your pipeline to send data to Kinesis.
- For instructions on how to build a data pipeline, see the Building a pipeline chapter in the Use the Data Stream Processor manual.
- For information about the source function, see Get data from Amazon Kinesis Data Stream in the Function Reference manual.
- For information about the sink function, see Send data to Amazon Kinesis in the Function Reference manual.
If you're planning to send the Kinesis data to a Splunk index, make sure to format the records so that they can be indexed meaningfully. See Formatting data from Amazon Kinesis Data Streams for indexing in the Splunk platform.