Splunk® Data Stream Processor

Getting Data In



On April 3, 2023, Splunk Data Stream Processor will reach its end of sale, and will reach its end of life on February 28, 2025. If you are an existing DSP customer, please reach out to your account team for more information.
This documentation does not apply to the most recent version of Splunk® Data Stream Processor. For documentation on the most recent version, go to the latest release.

Create a connection for the DSP Amazon Kinesis Connector with static credentials

Use the Amazon Kinesis Connector with static credentials to collect data from any Kinesis stream that your tenant has credentials for.

To use a connector, you must create a connection. You can reuse the Kinesis connection for both the Read from Amazon Kinesis Stream source function and the Write to Kinesis sink function.

  1. From the Data Management page, click the Connections tab.
  2. Click Create New Connection.
  3. Choose the Amazon Kinesis Connector using static credentials connector type.
  4. Click Next.
  5. Complete the following fields:
    Name: A unique name for your connection.
    Description: A description of your connection.
    AWS access key ID: The AWS access key ID with permissions to read from and write to the Kinesis stream. See Controlling Access to Amazon Kinesis Data Streams Resources Using IAM for more information about AWS IAM policies.
    AWS region: The AWS region of the Kinesis stream.
    AWS secret access key: The AWS secret access key with permissions to read from and write to the Kinesis stream.

    Any credentials that you upload are transmitted securely over HTTPS, encrypted, and securely stored in a secrets manager.

    If your data fails to get into DSP, check the fields again to make sure that you entered the correct name, AWS access key ID, AWS region, and AWS secret access key for your Kinesis stream. DSP doesn't check whether the credentials that you enter are valid, so you might want to verify them yourself before saving the connection, as shown in the sketch after these steps.

  6. Click Save.

    If you are editing a connection that's being used by an active pipeline, you must reactivate that pipeline after making your changes.
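
Because DSP doesn't validate the credentials you enter, you can optionally confirm them yourself before creating the connection. The following minimal sketch assumes you have Python and the boto3 AWS SDK available outside of DSP; the region, stream name, and credential values are placeholders that you would replace with the same values you plan to enter in the connection fields.

    import boto3

    # Placeholders: use the same region, access key ID, and secret access key
    # that you plan to enter in the connection.
    kinesis = boto3.client(
        "kinesis",
        region_name="us-east-1",
        aws_access_key_id="YOUR_ACCESS_KEY_ID",
        aws_secret_access_key="YOUR_SECRET_ACCESS_KEY",
    )

    # Read-side check: fails if the credentials can't describe the stream.
    summary = kinesis.describe_stream_summary(StreamName="my-stream")
    print(summary["StreamDescriptionSummary"]["StreamStatus"])

    # Write-side check: fails if the credentials can't put records on the stream.
    kinesis.put_record(
        StreamName="my-stream",
        Data=b'{"check": "connectivity"}',
        PartitionKey="connectivity-check",
    )

If either call fails with an access-denied error, review the IAM policy attached to the access key before creating the connection.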

You can now use your connection in a data pipeline.

See also

The Amazon Kinesis Connector delivers data in a specific schema. For information about what this schema looks like and how to process Amazon Kinesis data in your data pipeline, see Read from Amazon Kinesis Stream in the DSP Function Reference manual and Deserialize and preview data from Amazon Kinesis in the User Manual.
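
As a rough illustration of why deserialization is needed, the following sketch uses boto3 outside of DSP to read a few raw records directly from a Kinesis stream: each record's payload arrives as opaque bytes that must be decoded before individual fields can be used. The stream name and region are placeholders, the example assumes the producer writes JSON events, and the exact field names of the DSP schema are described in the topics referenced above.

    import json
    import boto3

    kinesis = boto3.client("kinesis", region_name="us-east-1")

    # Get an iterator for the first shard of a placeholder stream.
    shard_id = kinesis.list_shards(StreamName="my-stream")["Shards"][0]["ShardId"]
    iterator = kinesis.get_shard_iterator(
        StreamName="my-stream",
        ShardId=shard_id,
        ShardIteratorType="TRIM_HORIZON",
    )["ShardIterator"]

    # Each record's Data field is a byte payload; decode it to recover the event.
    for record in kinesis.get_records(ShardIterator=iterator)["Records"]:
        event = json.loads(record["Data"].decode("utf-8"))
        print(record["PartitionKey"], event)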


This documentation applies to the following versions of Splunk® Data Stream Processor: 1.1.0

