Splunk® Data Stream Processor

Connect to Data Sources and Destinations with DSP



On April 3, 2023, Splunk Data Stream Processor reached its end of sale, and will reach its end of life on February 28, 2025. If you are an existing DSP customer, please reach out to your account team for more information.

All DSP releases prior to DSP 1.4.0 use Gravity, a Kubernetes orchestrator, which has been announced end-of-life. We have replaced Gravity with an alternative component in DSP 1.4.0. Therefore, we will no longer provide support for versions of DSP prior to DSP 1.4.0 after July 1, 2023. We advise all of our customers to upgrade to DSP 1.4.0 in order to continue to receive full product support from Splunk.

Create a DSP connection to send data to Google Cloud Storage

To send data from a data pipeline in the Splunk Data Stream Processor (DSP) to a Google Cloud Storage bucket, you must first create a connection using the Google Cloud Storage Connector. You can then use the connection in the Send to Google Cloud Storage sink function to send data from your pipeline to your Cloud Storage bucket.

Prerequisites

Before you can create the Google Cloud Storage connection, you must have the following:

  • A Google Cloud service account that has the storage.objects.create permission.
  • A JSON file containing the credentials of your Google Cloud service account.

If you don't have the required permission or the credentials file, ask your Google Cloud Platform administrator for assistance. For information about managing permissions and roles, search for "Identity and Access Management" in the Google Cloud Storage documentation. For information about creating the credentials file, search for "Creating service account keys" in the Google Cloud Identity and Access Management (IAM) documentation.
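
If you want to confirm these prerequisites before creating the connection, you can test the key file outside of DSP with the Google Cloud Storage Python client library. The following is a minimal sketch only, not part of the DSP setup: the file name service-account-key.json and the bucket name my-dsp-bucket are placeholders for your own values, and it assumes the google-cloud-storage package is installed.

    from google.cloud import storage

    # Build a client from the downloaded service account key file (placeholder name).
    client = storage.Client.from_service_account_json("service-account-key.json")
    bucket = client.bucket("my-dsp-bucket")  # placeholder bucket name

    # Ask Cloud Storage which of the listed permissions this service account holds.
    granted = bucket.test_iam_permissions(["storage.objects.create"])
    print("storage.objects.create granted:", "storage.objects.create" in granted)

    # Exercise the permission directly by writing a small test object.
    bucket.blob("dsp-connection-test.txt").upload_from_string("connectivity check")

If the test object appears in your bucket, the service account and key file are ready to use in the connection.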

Steps

  1. In DSP, select the Connections page.
  2. On the Connections page, click Create Connection.
  3. On the Sink tab, select Google Cloud Storage Connector and then click Next.
  4. Complete the following fields:
    Connection Name: A unique name for your Google Cloud Storage connection.
    Description: (Optional) A description of your connection.
    Service Account Credentials JSON: The JSON file containing the credentials of your Google Cloud service account.

    Any credentials that you upload are transmitted securely by HTTPS, encrypted, and securely stored in a secrets manager.

  5. Click Save.

    If you're editing a connection that's being used by an active pipeline, you must reactivate that pipeline after making your changes. When you reactivate a pipeline, you must select where you want to resume data ingestion. See Using activation checkpoints to activate your pipeline in the Use the Data Stream Processor manual for more information.

You can now use your connection in the Send to Google Cloud Storage sink function at the end of your data pipeline to send data to a Cloud Storage bucket. For instructions on how to build a data pipeline, see the Building a pipeline chapter in the Use the Data Stream Processor manual. For information about the sink function, see Send data to Google Cloud Storage in the Function Reference manual.
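
Once the pipeline is active, you can spot-check that data is arriving in the bucket. The sketch below reuses the same placeholder credential file and bucket name as the prerequisite check above and simply lists the most recently written objects; it is a convenience for verification only and is not required by DSP. Note that listing objects requires the storage.objects.list permission, which is separate from the storage.objects.create permission listed in the prerequisites.

    from google.cloud import storage

    client = storage.Client.from_service_account_json("service-account-key.json")

    # List up to ten objects that the sink function has written to the bucket so far.
    for blob in client.list_blobs("my-dsp-bucket", max_results=10):
        print(blob.name, blob.size, blob.updated)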

Last modified on 25 March, 2022

This documentation applies to the following versions of Splunk® Data Stream Processor: 1.3.0, 1.3.1, 1.4.0, 1.4.1, 1.4.2, 1.4.3

