Splunk® Data Stream Processor

Connect to Data Sources and Destinations with DSP

DSP 1.2.1 is impacted by the CVE-2021-44228 and CVE-2021-45046 security vulnerabilities from Apache Log4j. To fix these vulnerabilities, you must upgrade to DSP 1.2.4. See Upgrade the Splunk Data Stream Processor to 1.2.4 for upgrade instructions.

On October 30, 2022, all 1.2.x versions of the Splunk Data Stream Processor will reach their end of support date. See the Splunk Software Support Policy for details.
This documentation does not apply to the most recent version of Splunk® Data Stream Processor. For documentation on the most recent version, go to the latest release.

Create a DSP connection to Amazon Kinesis Data Streams

To get data from Amazon Kinesis Data Streams into a data pipeline in the Splunk Data Stream Processor, or send data from a pipeline to a Kinesis data stream, you must first create a connection. You can then use the connection in the Amazon Kinesis Data Stream source function or sink function.

Prerequisites

Before you can create the Kinesis connection, you must have an IAM user with the permissions described in the following sections.

If you don't have an IAM user with the necessary permissions, ask your Amazon Web Services (AWS) administrator for assistance.

AWS permissions for getting data from Amazon Kinesis Data Streams

Make sure that the IAM user that you are using to authenticate the connection has read permissions for the Kinesis stream. See the following list of permissions:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "kinesis:ListShards",
        "kinesis:GetRecords",
        "kinesis:GetShardIterator",
        "kinesis:DescribeStream"
      ],
      "Resource": "*"
    }
  ]
}

As a best practice for making sure that these permissions are not applied unnecessarily to other Kinesis streams, in the Resource element, specify the Amazon Resource Name (ARN) of the Kinesis stream that you want to read from. For example, the following Resource definition ensures that your specified permissions are applied only to the Kinesis stream named KinesisStreamA:

      "Resource": [
        "arn:aws:kinesis:*:123123123123:stream/KinesisStreamA"
      ]

Search for "Controlling Access to Amazon Kinesis Data Streams Resources Using IAM" in the Amazon Kinesis Data Streams Developer Guide for more information about defining permissions for accessing Amazon Kinesis.
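If you manage IAM policies as code, the scoped policy shown above can be generated programmatically. The following sketch is for illustration only and is not part of DSP; the helper function name is hypothetical, but the action names and policy structure match the policy shown above.

```python
import json

# The four read actions required by the DSP Kinesis source function,
# as listed in the policy above.
READ_ACTIONS = [
    "kinesis:ListShards",
    "kinesis:GetRecords",
    "kinesis:GetShardIterator",
    "kinesis:DescribeStream",
]

def kinesis_read_policy(stream_arn):
    """Build an IAM policy document granting read access to one stream.

    Hypothetical helper; scoping Resource to a single stream ARN is the
    best practice described above.
    """
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": READ_ACTIONS,
                "Resource": [stream_arn],
            }
        ],
    }

# Produce a policy scoped to KinesisStreamA only:
policy = kinesis_read_policy("arn:aws:kinesis:*:123123123123:stream/KinesisStreamA")
print(json.dumps(policy, indent=2))
```

Generating the document this way keeps the Resource element scoped per stream instead of defaulting to "*".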

AWS permissions for sending data to Amazon Kinesis Data Streams

Make sure that the IAM user that you are using to authenticate the connection has write permissions for the Kinesis stream. See the following list of permissions:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "kinesis:PutRecord",
        "kinesis:PutRecords",
        "kinesis:DescribeStream"
      ],
      "Resource": "*"
    }
  ]
}

As a best practice for making sure that these permissions are not applied unnecessarily to other Kinesis streams, in the Resource element, specify the Amazon Resource Name (ARN) of the Kinesis stream that you want to write to. For example, the following Resource definition ensures that your specified permissions are applied only to the Kinesis stream named KinesisStreamB:

      "Resource": [
        "arn:aws:kinesis:*:123123123123:stream/KinesisStreamB"
      ]

Search for "Controlling Access to Amazon Kinesis Data Streams Resources Using IAM" in the Amazon Kinesis Data Streams Developer Guide for more information about defining permissions for accessing Amazon Kinesis.
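Before saving a scoped policy, you can sanity-check that each Resource entry is a well-formed Kinesis stream ARN. The following sketch uses an approximate pattern for the ARN shape shown in the examples above; it is an illustrative heuristic, not an official AWS validator.

```python
import re

# Approximate shape of a Kinesis stream ARN:
#   arn:aws:kinesis:<region>:<account-id>:stream/<stream-name>
# "*" is allowed in place of the region, as in the examples above.
_KINESIS_STREAM_ARN = re.compile(
    r"^arn:aws:kinesis:(\*|[a-z0-9-]+):\d{12}:stream/[a-zA-Z0-9_.-]+$"
)

def is_kinesis_stream_arn(arn):
    """Return True if arn looks like a Kinesis stream ARN (heuristic check)."""
    return bool(_KINESIS_STREAM_ARN.match(arn))
```

A quick check: `is_kinesis_stream_arn("arn:aws:kinesis:*:123123123123:stream/KinesisStreamB")` accepts the example ARN, while an S3 ARN is rejected.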

Steps

  1. From the Data Stream Processor home page, click Data Management and then select the Connections tab.
  2. Click Create New Connection.
  3. Select Connector for Amazon Kinesis Data Streams and then click Next.
  4. Complete the following fields:
    Name: A unique name for your connection.
    Description: (Optional) A description of your connection.
    AWS Access Key ID: The access key ID for your IAM user.
    AWS Secret Access Key: The secret access key for your IAM user.
    AWS Region: The AWS region of the Kinesis stream.

    Any credentials that you upload are transmitted securely by HTTPS, encrypted, and securely stored in a secrets manager.

  5. Click Save.

    If you're editing a connection that's being used by an active pipeline, you must reactivate that pipeline after making your changes. When you reactivate a pipeline, you must select where you want to resume data ingestion. See Using activation checkpoints to activate your pipeline in the Use the Data Stream Processor manual for more information.

You can now use your connection in an Amazon Kinesis Data Stream source function at the start of your data pipeline to get data from Kinesis, or in a Send to Amazon Kinesis Data Streams sink function at the end of your pipeline to send data to Kinesis.

If you're planning to send the Kinesis data to a Splunk index, make sure to format the records so that they can be indexed meaningfully. See Formatting data from Amazon Kinesis Data Streams for indexing in the Splunk platform.
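Record payloads in Kinesis are opaque bytes (base64-encoded on the wire, raw bytes once decoded by an AWS SDK), so a common first formatting step is decoding the payload back into structured data. The following sketch assumes the producer wrote UTF-8 JSON; the helper name is hypothetical and this runs outside DSP, purely to illustrate the decoding step.

```python
import base64
import json

def decode_kinesis_payload(data):
    """Decode a Kinesis record payload into a Python dict.

    Assumes the producer wrote UTF-8 JSON. `data` may be raw bytes
    (as returned by an AWS SDK) or a base64 string (as sent on the
    wire). Hypothetical helper for illustration only.
    """
    if isinstance(data, str):
        data = base64.b64decode(data)
    return json.loads(data.decode("utf-8"))

# Decode an example payload into a dict ready for further formatting:
event = decode_kinesis_payload(b'{"source": "sensor-1", "temp": 21.5}')
```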

Last modified on 26 February, 2022

This documentation applies to the following versions of Splunk® Data Stream Processor: 1.2.0, 1.2.1-patch02, 1.2.1, 1.2.2-patch02, 1.2.4, 1.2.5

