Splunk® Data Stream Processor

Use the Data Stream Processor



On April 3, 2023, Splunk Data Stream Processor will reach its end of sale, and will reach its end of life on February 28, 2025. If you are an existing DSP customer, please reach out to your account team for more information.
This documentation does not apply to the most recent version of Splunk® Data Stream Processor. For documentation on the most recent version, go to the latest release.

Send data from Splunk DSP to Amazon S3

You can use a Write to S3-compatible storage connection with a Write to S3-compatible storage sink function to send data from the Splunk Data Stream Processor (DSP) into an Amazon S3 bucket.

You need a Universal license to send data from DSP to Amazon S3. See Licensing for the Splunk Data Stream Processor.

The S3-compatible storage connector can't be used to collect data from S3 buckets. If you want to collect data from Amazon S3, you must use the Amazon S3 connector. See Use the Amazon S3 Connector with Splunk DSP for more information.

You can only write to Amazon S3 buckets. Third-party S3-compatible vendors are not supported.

Prerequisites

The AWS account you use to create the Amazon S3 sink connection must have the following permissions:

  • s3:Get*
  • s3:Delete*
  • s3:Put*
  • s3:ListBucket
  • s3:ListBucketMultipartUploads

If you are using KMS with customer-managed keys to encrypt the files, you must have the following additional permissions:

  • kms:Decrypt
  • kms:GenerateDataKey
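
Taken together, the permissions above could be granted with an IAM policy along these lines. This is a sketch, not a Splunk-provided policy: the bucket name, account ID, region, and KMS key ID are placeholders you'd replace with your own values, and the KMS statement is only needed if you encrypt files with customer-managed keys.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "DspObjectAccess",
      "Effect": "Allow",
      "Action": ["s3:Get*", "s3:Put*", "s3:Delete*"],
      "Resource": "arn:aws:s3:::example-dsp-bucket/*"
    },
    {
      "Sid": "DspBucketAccess",
      "Effect": "Allow",
      "Action": ["s3:ListBucket", "s3:ListBucketMultipartUploads"],
      "Resource": "arn:aws:s3:::example-dsp-bucket"
    },
    {
      "Sid": "DspKmsAccess",
      "Effect": "Allow",
      "Action": ["kms:Decrypt", "kms:GenerateDataKey"],
      "Resource": "arn:aws:kms:us-east-1:111122223333:key/example-key-id"
    }
  ]
}
```

Note that the object-level actions (`s3:Get*`, `s3:Put*`, `s3:Delete*`) apply to the objects in the bucket (`/*`), while the bucket-level list actions apply to the bucket ARN itself.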

Steps

  1. Click the Connections tab.
  2. Click Create new connection.
  3. Choose the S3-compatible storage connector.
  4. Click Next.
  5. Complete the following fields:
    Field                  Description
    Connection Name        The connection name.
    Description            A description of your connection.
    AWS Access Key ID      Your AWS access key ID.
    AWS Secret Access Key  Your AWS secret access key.

    Any credentials that you upload are transmitted securely over HTTPS, encrypted, and stored in a secrets manager.

  6. Click Save.
  7. If you edited a connection that's being used by an active pipeline, reactivate that pipeline for your changes to take effect.

You can now use your S3-compatible storage connection with the Write to S3-compatible storage sink function to send data to an Amazon S3 bucket.

Last modified on 31 August, 2020

This documentation applies to the following versions of Splunk® Data Stream Processor: 1.1.0

