Splunk® Supported Add-ons

Splunk Add-on for Google Cloud Platform


Configure Google Workspace audit logs for the Splunk Add-on for Google Cloud Platform

Configure the HTTP Event Collector (HEC) to ingest Google Workspace (GWS) audit logs.

To configure Google Workspace audit logs for the Splunk Add-on for Google Cloud Platform, perform the following steps.

  1. Configure, view, and route audit logs for Google Workspace to Google Cloud.
    See the View and manage audit logs for Google Workspace topic in the Google Cloud documentation.
  2. Share data from your Google Workspace account with services in your organization's Google Cloud Platform (GCP) account.
    See the Share data with Google Cloud Platform services topic in the Google Workspace Admin Help documentation.
  3. Export your audit logs to your Google Cloud Pub/Sub.
    See the Configure and manage sinks topic in the Operations Suite manual in the Google Cloud documentation.
  4. Configure Cloud Pub/Sub inputs for Splunk Add-on for Google Cloud Platform.
    See the Configure Cloud Pub/Sub inputs for Splunk Add-on for Google Cloud Platform topic in this manual.

Export your Google Workspace directory user list to the Splunk Add-on for Google Cloud Platform

  1. Download the user list and export it to your Pub/Sub topic. See the Download a list of users topic in the Advanced user management section of the Google Workspace Admin Help documentation.
  2. Configure Cloud Pub/Sub inputs for Splunk Add-on for Google Cloud Platform.
    See the Configure Cloud Pub/Sub inputs for Splunk Add-on for Google Cloud Platform topic in this manual.
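  As a minimal sketch, assuming a hypothetical export file named users.csv and a hypothetical topic named gws-users, you could publish each exported row to your topic with the gcloud CLI; substitute your own file and topic names:
    # Skip the CSV header row, then publish one Pub/Sub message per user row.
    # users.csv and gws-users are hypothetical names; substitute your own.
    tail -n +2 users.csv | while IFS= read -r line; do
      gcloud pubsub topics publish gws-users --message="$line"
    done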

For information on Assets and Identity extractions and configurations, see the Collect and extract asset and identity data in Splunk Enterprise Security topic in the Administer Splunk Enterprise Security chapter in the Splunk Enterprise Security manual.

Configure Pub/Sub topics in Google Cloud

Configure Pub/Sub topics in Google Cloud to ingest data into your Splunk platform deployment.

  1. Navigate to the Google Cloud project that you configured for log aggregation across your organization.
  2. Create the Pub/Sub topics. Navigate to Pub/Sub in your project and create two topics:
    1. A primary topic to hold messages to be delivered.
    2. A secondary, dead-letter topic to store undeliverable messages when Dataflow cannot stream to the HTTP Event Collector (HEC), for example, because of a misconfigured HEC SSL certificate, a disabled HEC token, or a message processing error in Dataflow.
    • Primary: <primary-topic-name>
    • Secondary: <dead-letter-topic-name>
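    If you prefer the gcloud CLI to the console, the following minimal sketch creates both topics. The names <primary-topic-name> and <dead-letter-topic-name> are placeholders; substitute your own:
      gcloud pubsub topics create <primary-topic-name>
      gcloud pubsub topics create <dead-letter-topic-name>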
  3. Create subscriptions for both topics created in the previous step.
    • Enter any name for your subscription
    • Select the Pub/Sub primary topic created in the previous step
    • Leave the rest of the values default or customize to your organization's preference
    • Repeat the same steps for your dead-letter topic
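    The equivalent gcloud CLI sketch, assuming the placeholder names from the previous step:
      gcloud pubsub subscriptions create <primary-subscription-name> --topic=<primary-topic-name>
      gcloud pubsub subscriptions create <dead-letter-subscription-name> --topic=<dead-letter-topic-name>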
  4. Create an organization-level aggregated log sink. This lets administrators configure a single sink that captures all logs across an organization's projects and sends them to the Pub/Sub topic created above.

    You cannot create aggregated sinks through the Google Cloud Console. They must be configured and managed through either the API or the gcloud CLI tool. Only project-level (non-aggregated) sinks appear in the Google Cloud Console at this time.

  5. Open a Cloud Shell session in the active project.
  6. Enter the following in the Cloud Shell to create the aggregated sink:
    gcloud logging sinks create <sample-sink> \
    pubsub.googleapis.com/projects/<sample-project>/topics/<primary-topic-name> \
    --include-children \
    --organization=<organization-id> \
    --log-filter='logName:"organizations/<organization-id>/logs/cloudaudit.googleapis.com"'
    

    (Optional) If you want to export more than Google Workspace events, modify the --log-filter value to capture any additional logs that you want to export.

    See the Google Cloud documentation for more information on creating aggregated log sinks.

  7. Update permissions for the sink's service account. The output of the sink creation command in the previous step includes the name of a service account; granting that account the Pub/Sub Publisher role allows the sink to publish messages to your previously created Pub/Sub topics. To update the permissions, copy the entire service account name and do the following:
    1. Open a cloud shell in the active project, or use the existing shell.
    2. Enter the following into the shell.
      gcloud pubsub topics add-iam-policy-binding <primary-topic-name> \
      --member serviceAccount:<service-account-name-from-previous-step> \
      --role roles/pubsub.publisher
      
    3. (Optional) Validate the service account and permission association with the following command:
      gcloud logging sinks describe <sample-sink> --organization=<organization-id>
  8. Referencing the logging configurations that you set up in the previous steps, configure the Dataflow template to output the logs to the Splunk HEC.
    1. Navigate to Dataflow and select Create New Job From Template
    2. Populate the following fields:
        • Job name (Any)
        • Preferred Region
        • Cloud Dataflow Template: Cloud Pub/Sub to Splunk
        • Pub/Sub Subscription name created in previous steps
        • HEC token, created in previous steps
        • HEC URL, created in previous steps
        • DLT Topic, created in previous steps
        • Source: the token default source value. The Splunk software assigns this value to data that doesn't already have a source value set.
        • Sourcetype: the token default sourcetype value. The Splunk software assigns this value to data that doesn't already have a sourcetype value set.
        • Any bucket name. If you have not created a bucket, navigate to Storage and create a new bucket. The syntax for the bucket name is gs://bucketName.
        • Expand Optional Parameters
          • Set Batch size for sending multiple events to Splunk HEC to 2 (can be adjusted later depending on your volume)
          • Set Maximum Number of Parallel Requests to 8 (can be adjusted later depending on your volume)
          • Set Max workers to 2 (can be adjusted later depending on your volume).

        The default is 20, which incurs unnecessary Persistent Disk cost if the workers are not fully utilized.

    3. Enter any additional settings pertinent to your organization.
    4. Run the job.
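    Alternatively, you can launch the same Google-provided Pub/Sub to Splunk Dataflow template from the gcloud CLI. This is a minimal sketch, not a definitive configuration; the job name, region, bucket, subscription, topic, and HEC values are placeholders for the resources you created in the previous steps:
      gcloud dataflow jobs run pubsub-to-splunk-job \
      --gcs-location=gs://dataflow-templates/latest/Cloud_PubSub_to_Splunk \
      --region=<preferred-region> \
      --staging-location=gs://<bucket-name>/temp \
      --max-workers=2 \
      --parameters=inputSubscription=projects/<sample-project>/subscriptions/<primary-subscription-name>,token=<hec-token>,url=<hec-url>,outputDeadletterTopic=projects/<sample-project>/topics/<dead-letter-topic-name>,batchCount=2,parallelism=8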