Data Manager

Troubleshooting Manual


Troubleshoot Google Cloud Platform data ingestion in Data Manager

This topic describes how to troubleshoot Google Cloud Platform (GCP) data ingestion in Data Manager. The following troubleshooting tips, among others, can assist you throughout the onboarding process.

For more troubleshooting tips, see the Troubleshooting in Data Manager and the Troubleshoot general problems chapters in this manual.

Data management troubleshooting

If your status on the Data Management page is not Success or In Progress, and the status never changes when you click Refresh, you may have to delete the data input and start again.

For information about status messages, see Verify the data input for GCP in Data Manager.

Search for events and logs

Use the following searches to find events and logs. From the Splunk Cloud menu bar, select Apps > Search & Reporting.

If data ingestion is failing but you see no errors in Data Manager, you can check for errors in the GCP logs by running the following search in the Search & Reporting app.

index=<user selected index> sourcetype="google:gcp:pubsub:*"
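To narrow the results to events that are more likely to describe a failure, you can add keyword filters to the same search. This is only an illustrative sketch; the keywords that actually appear in your events depend on the failure, so adjust them as needed.

index=<user selected index> sourcetype="google:gcp:pubsub:*" ("error" OR "failed")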

Search for GCP events associated with a specific input ID.

index=<user selected index> source=gcp_cloud_logging_<input id>
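For example, to confirm that events for the input are still arriving and to see which sourcetypes they use, you can run a summarizing search such as the following. This is only a sketch; use the same index and input ID placeholders as in the searches above, and adjust the time range as needed.

index=<user selected index> source=gcp_cloud_logging_<input id> earliest=-1h | stats count by sourcetype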

Recover events from the dead-letter topic for a GCP input

To recover the events from the dead-letter topic, perform the following steps.

  1. Identify and fix the cause of your failed messages.

    Before you trigger the recovery Dataflow job, fix the root cause of the delivery failures. To identify and fix the root cause of your message failures, perform the steps in the Troubleshoot failed messages section of the Deploying production-ready log exports to Splunk using Dataflow topic in the Google Cloud documentation. To inspect a sample of the failed messages directly, see the example after these steps.

  2. Replay your failed messages.

    After you fix the root cause of your failed messages, your GCP admin must trigger a Pub/Sub to Pub/Sub Dataflow job to replay the logs that are currently in the dead-letter topic. This delivers the events from the dead-letter topic back to the original Dataflow job that writes to Splunk software, so that they can be ingested into your Splunk Cloud deployment.
    To trigger the Dataflow job that delivers the messages back to the primary GCP Dataflow job that was set up using Terraform to export logs to your Splunk Cloud platform, follow the steps in the Replay failed messages section of the Deploying production-ready log exports to Splunk using Dataflow topic in the GCP documentation.

    Use the following commands to start a Pub/Sub to Pub/Sub Dataflow job that transfers the messages from your dead-letter topic back to the primary GCP Dataflow job that was set up using Terraform:

      # Identifiers for your Data Manager input and Dataflow deployment
      DM_INPUT_ID="<copy the Data Manager Input ID here>"
      PROJECT_ID="<copy your GCP Dataflow project ID here>"
      REGION_ID="<copy your GCP Dataflow region here>"

      # Pub/Sub topic and dead-letter subscription that were created for this input
      DATAFLOW_INPUT_TOPIC="pubsub_topic_logging_${DM_INPUT_ID}"
      DATAFLOW_DEADLETTER_SUB="pubsub_subscription_deadletter_${DM_INPUT_ID}"

      # Run the Pub/Sub to Pub/Sub Dataflow template to replay the dead-letter
      # messages back to the original input topic
      JOB_NAME=splunk-dataflow-replay-$(date +"%Y%m%d-%H%M%S")
      gcloud dataflow jobs run $JOB_NAME \
           --gcs-location=gs://dataflow-templates/latest/Cloud_PubSub_to_Cloud_PubSub \
           --worker-machine-type=n1-standard-2 \
           --max-workers=1 \
           --region=$REGION_ID \
           --parameters=inputSubscription=projects/${PROJECT_ID}/subscriptions/${DATAFLOW_DEADLETTER_SUB},outputTopic=projects/${PROJECT_ID}/topics/${DATAFLOW_INPUT_TOPIC}
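
If you want to inspect a sample of the failed messages mentioned in step 1 before you replay them, your GCP admin can pull a few messages from the dead-letter subscription with the gcloud CLI. This is a minimal sketch that assumes the same PROJECT_ID and DATAFLOW_DEADLETTER_SUB values defined in step 2. Because the messages are not acknowledged, they are redelivered after the acknowledgement deadline and remain available for the replay job.

# Pull up to 5 dead-letter messages for inspection without acknowledging them
gcloud pubsub subscriptions pull \
     "projects/${PROJECT_ID}/subscriptions/${DATAFLOW_DEADLETTER_SUB}" \
     --limit=5 \
     --format=json

After you start the replay job in step 2, you can confirm that it is running before you search for the recovered events in your Splunk Cloud deployment. The following command is one way to check with the gcloud CLI.

# List active Dataflow jobs in the region to confirm that the replay job started
gcloud dataflow jobs list --region=$REGION_ID --status=active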
    
Last modified on 07 September, 2022

This documentation applies to the following versions of Data Manager: 1.7.0

