Splunk® Data Stream Processor

Connect to Data Sources and Destinations with DSP


Deserialize and preview data from Google Cloud Pub/Sub in DSP

When you use the Google Cloud Pub/Sub source function to ingest data from a Google Cloud Pub/Sub topic, the payloads of the incoming records are stored in a bytes field named body. During data previews, the Splunk Data Stream Processor displays the contents of bytes fields as base64-encoded values. To view the data as human-readable strings during data preview, you must deserialize the data.

Deserializing the body field also makes it usable as input in a wider variety of streaming functions, since most streaming functions do not accept bytes data as input. See the Function Reference manual for information about the data type that each function accepts as input.


Prerequisites

To ingest data from Google Cloud Pub/Sub into a DSP pipeline, you must have a connection to a Google Cloud Pub/Sub topic. See Create a DSP connection to Google Cloud Pub/Sub.


Steps

  1. In DSP, select the Pipelines page.
  2. On the Pipelines page, click Create Pipeline.
  3. Select Google Cloud Pub/Sub.
  4. Configure the Google Cloud Pub/Sub function to use your Pub/Sub connection and get data from your Pub/Sub topic. See Get data from Google Cloud Pub/Sub in the Function Reference manual.
  5. On the pipeline canvas, click the Connect a processing or a sink function icon, and then select Eval from the function picker.
  6. On the View Configurations tab, in the Function field, enter an SPL2 expression that deserializes the body field from bytes into a string.
  7. Click the Start Preview icon, and then click the Eval function on the pipeline canvas to confirm that the data in the body field has been deserialized from bytes into strings.
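As a sketch of the Eval configuration in step 6: assuming the payloads in your Pub/Sub topic are UTF-8 text, the to_string scalar function converts a bytes value to a string, so an expression along these lines deserializes the body field in place:

    body=to_string(body)

If your payloads are JSON, deserialize_json_object(body) returns a map of the parsed fields instead of a flat string. Check the Function Reference manual for the exact function names and signatures available in your version of DSP.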
Last modified on 25 March, 2022

This documentation applies to the following versions of Splunk® Data Stream Processor: 1.3.0, 1.3.1
