
Ingest Google Cloud Platform log data

To export GCP Cloud Logging data to Splunk Observability Cloud, create a Pub/Sub subscription and use the Pub/Sub to Splunk Dataflow template to create a Dataflow job. The Dataflow job pulls messages from the Pub/Sub subscription, converts each payload into Splunk HTTP Event Collector (HEC) event format, and forwards the events to Splunk Observability Cloud, which ingests the entire event, including the JSON payload and its metadata.
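As a point of reference, the following is a minimal sketch of setting up the Pub/Sub side of this pipeline with the gcloud CLI. The project, topic, subscription, and sink names are placeholder values, and your environment might require additional IAM configuration.

    # Sketch only: my-project, splunk-logs-topic, splunk-logs-sub, and
    # splunk-logs-sink are placeholder names; substitute your own values.

    # Create the Pub/Sub topic that receives exported log entries.
    gcloud pubsub topics create splunk-logs-topic --project=my-project

    # Create the subscription that the Dataflow job reads from.
    gcloud pubsub subscriptions create splunk-logs-sub \
        --topic=splunk-logs-topic --project=my-project

    # Route Cloud Logging entries into the topic through a log sink.
    # Adjust --log-filter to select the logs you want to export.
    gcloud logging sinks create splunk-logs-sink \
        pubsub.googleapis.com/projects/my-project/topics/splunk-logs-topic \
        --log-filter='severity>=INFO' --project=my-project

After you create the sink, grant its writer identity (shown by gcloud logging sinks describe splunk-logs-sink) the Pub/Sub Publisher role on the topic so that exported log entries can be published.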

To learn more, see Scenarios for exporting Cloud Logging data: Splunk. While this GCP documentation describes both pull- and push-based methods for exporting Google Cloud Logging data, Splunk Observability Cloud supports the push-based method only.

Use the example gcloud command provided in Option A: Stream logs using Pub/Sub to Splunk Dataflow, with the following changes (a consolidated sketch of the resulting command follows this list):

  • Change the token in the sample syntax (token=your-splunk-hec-token) to a Splunk Observability Cloud organization access token with ingest permission. For more information about organization access tokens, see Create and manage organization access tokens using Splunk Observability Cloud.

  • Change the URL in the sample syntax (url=your-splunk-hec-url) to point to the real-time log data ingest endpoint for Splunk Observability Cloud: https://ingest.{REALM}.signalfx.com, where {REALM} is your organization's realm (for example, https://ingest.us0.signalfx.com:443).
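Putting both changes together, the following is a hedged sketch of the resulting command. The job name, region, staging bucket, subscription, and dead-letter topic are placeholder values; the template location and parameter names come from GCP's Pub/Sub to Splunk Dataflow template.

    # Sketch only: pubsub-to-splunk, us-central1, my-dataflow-bucket, and the
    # subscription and topic paths are placeholders; replace with your values.
    PARAMS="inputSubscription=projects/my-project/subscriptions/splunk-logs-sub"
    PARAMS="$PARAMS,token=YOUR_ORG_ACCESS_TOKEN"
    PARAMS="$PARAMS,url=https://ingest.us0.signalfx.com"
    PARAMS="$PARAMS,outputDeadletterTopic=projects/my-project/topics/splunk-logs-deadletter"

    gcloud dataflow jobs run pubsub-to-splunk \
        --gcs-location=gs://dataflow-templates/latest/Cloud_PubSub_to_Splunk \
        --region=us-central1 \
        --staging-location=gs://my-dataflow-bucket/temp \
        --parameters="$PARAMS"

The outputDeadletterTopic parameter names a Pub/Sub topic that receives messages the job could not deliver, which is what makes the replay procedure described in the note below possible.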

Note

Any response code other than 2xx, including throttling responses, indicates a message delivery failure. If message delivery fails, you can manually replay unprocessed messages by following the instructions in the “Handling delivery failures” section of the GCP documentation on deploying production-ready log exports to Splunk using Dataflow.
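For completeness, GCP's guide replays dead-letter messages by running a second Dataflow job, based on the Pub/Sub to Pub/Sub template, that drains the dead-letter subscription back into the input topic. The following is a minimal sketch that reuses the placeholder names from the examples above and assumes a subscription named splunk-logs-deadletter-sub exists on the dead-letter topic.

    # Sketch only: job, bucket, subscription, and topic names are placeholders.
    # The Pub/Sub to Pub/Sub template moves unprocessed messages from the
    # dead-letter subscription back to the input topic for reprocessing.
    gcloud dataflow jobs run replay-deadletter \
        --gcs-location=gs://dataflow-templates/latest/Cloud_PubSub_to_Cloud_PubSub \
        --region=us-central1 \
        --staging-location=gs://my-dataflow-bucket/temp \
        --parameters="inputSubscription=projects/my-project/subscriptions/splunk-logs-deadletter-sub,outputTopic=projects/my-project/topics/splunk-logs-topic"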