Send data from Splunk DSP to Amazon S3
You can use a Write to S3-compatible storage connection with a Write to S3-compatible storage sink function to send data from the Splunk Data Stream Processor (DSP) into an Amazon S3 bucket.
You need a Universal license to send data from DSP to Amazon S3. See Licensing for the Splunk Data Stream Processor.
The S3-compatible storage connector can't be used to collect data from S3 buckets. If you want to collect data from Amazon S3, you must use the Amazon S3 connector. See Use the Amazon S3 Connector with Splunk DSP for more information.
You can only write to Amazon S3 buckets. Third-party S3-compatible vendors are not supported.
Prerequisites
The AWS account you use to create the Amazon S3 sink connection must have the following permissions:
- s3:Get*
- s3:Delete*
- s3:Put*
- s3:ListBucket
- s3:ListBucketMultipartUploads
If you are using KMS with customer-managed keys to encrypt the files, you must have the following additional permissions:
- kms:Decrypt
- kms:GenerateDataKey
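The permissions above can be granted through a single IAM policy attached to the AWS account or IAM user whose keys you supply. The following is a sketch, not official Splunk guidance: replace `my-dsp-bucket` with your bucket name, and include the second statement only if you encrypt files with customer-managed KMS keys (the key ARN shown is a placeholder):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:Get*",
        "s3:Delete*",
        "s3:Put*",
        "s3:ListBucket",
        "s3:ListBucketMultipartUploads"
      ],
      "Resource": [
        "arn:aws:s3:::my-dsp-bucket",
        "arn:aws:s3:::my-dsp-bucket/*"
      ]
    },
    {
      "Effect": "Allow",
      "Action": [
        "kms:Decrypt",
        "kms:GenerateDataKey"
      ],
      "Resource": "arn:aws:kms:us-east-1:111122223333:key/example-key-id"
    }
  ]
}
```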
Steps
- Click the Connections tab.
- Click Create new connection.
- Choose the S3-compatible storage connector.
- Click Next.
- Complete the following fields:
  - Connection Name: The connection name.
  - Description: A description of your connection.
  - AWS Access Key ID: Your AWS access key ID.
  - AWS Secret Access Key: Your AWS secret access key.
  Any credentials that you upload are transmitted securely over HTTPS, encrypted, and securely stored in a secrets manager.
- Click Save.
- (Optional) If you are editing a connection that's being used by an active pipeline, reactivate that pipeline after making your changes so that the updated connection takes effect.
You can now use your S3-compatible storage connection with the Write to S3-compatible storage sink function to send data to an Amazon S3 bucket.
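Before relying on the connection in a pipeline, you may want to confirm that the access key pair can actually write to the target bucket. This Python sketch (it uses the third-party boto3 library and is not part of DSP; the bucket and key names are placeholders) performs a PutObject/DeleteObject round trip with the same credentials you entered in the connection:

```python
def check_s3_write_access(bucket, access_key_id, secret_access_key,
                          key="dsp-connection-test"):
    """Write and then delete a small test object to confirm that the
    supplied credentials have s3:Put* and s3:Delete* on the bucket.
    Raises botocore.exceptions.ClientError if a permission is missing."""
    import boto3  # third-party: pip install boto3

    s3 = boto3.client(
        "s3",
        aws_access_key_id=access_key_id,
        aws_secret_access_key=secret_access_key,
    )
    s3.put_object(Bucket=bucket, Key=key, Body=b"DSP connection test")
    s3.delete_object(Bucket=bucket, Key=key)
    return True
```

If the call raises an AccessDenied error, revisit the permissions listed in the Prerequisites section before creating the connection.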
This documentation applies to the following versions of Splunk® Data Stream Processor: 1.1.0