Splunk® Data Stream Processor

Use the Data Stream Processor

Splunk Data Stream Processor reaches its end of sale on April 3, 2023, and its end of life on February 28, 2025. If you are an existing DSP customer, contact your account team for more information.
This documentation does not apply to the most recent version of Splunk® Data Stream Processor. For documentation on the most recent version, go to the latest release.

Deserialize and send Azure Event Hubs data from a DSP pipeline

Your pipeline can ingest data from Azure Event Hubs and send that data to a Splunk index or a third-party sink.

Prerequisites

This connector uses a non-epoch receiver to retrieve data from an Azure Event Hub through a consumer group. An Event Hub consumer group can support up to five concurrent non-epoch receivers. If you use the same consumer group across multiple pipelines, or if other programs connect using this consumer group, your pipeline might fail to connect. In addition, if an epoch-based receiver connects through the same consumer group, connections from non-epoch receivers are rejected. To avoid having too many receivers, or epoch-based receivers from other programs, connecting through the same consumer group, the best practice is to create a dedicated consumer group for each pipeline.
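
If you manage your Event Hubs namespace programmatically, the following sketch shows one way to create a dedicated consumer group with the Python azure-mgmt-eventhub and azure-identity packages. This is an illustration, not part of the DSP product: the subscription ID, resource group, and namespace names are hypothetical placeholders.

     # Sketch: create a consumer group dedicated to one DSP pipeline.
     # All names below are placeholders; substitute your own values.
     from azure.identity import DefaultAzureCredential
     from azure.mgmt.eventhub import EventHubManagementClient

     client = EventHubManagementClient(DefaultAzureCredential(), "<subscription-id>")
     client.consumer_groups.create_or_update(
         resource_group_name="my-resource-group",
         namespace_name="my-eventhubs-namespace",
         event_hub_name="operational-logs",        # same Event Hub the pipeline reads
         consumer_group_name="dsp-ConsumerGroup",  # dedicated to this pipeline
         parameters={},
     )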

Steps
Once you satisfy the prerequisites, you can ingest data from Azure Event Hubs.

  1. From the Add Data page, select Read from Azure Event Hubs using SAS Key as your source function.
  2. On the next page, complete the following fields:
     Field               | Description                                                                  | Example
     Connection Id       | The name of the connection set up as a prerequisite.                        | myEventHubConnection
     Event Hub name      | The name of the Event Hub entity to subscribe to.                           | operational-logs
     Consumer Group Name | The name of the consumer group set up as a prerequisite.                    | dsp-ConsumerGroup
     Starting Position   | Optional. Specifies where to start reading data from. Defaults to earliest. | earliest
  3. Click the + to add a new function.
  4. Select Eval.
  5. Azure Event Hubs transmits data as bytes, which don't follow the DSP event schema, so you must convert your data from bytes. In the Eval function textbox, call to-string to turn the bytes into a string, or deserialize-json-object to turn JSON-formatted bytes into a map. Use whichever expression matches your data; to send a JSON test event and verify the conversion, see the sketch after these steps.
     as(to-string(get("body")), "body");
     as(deserialize-json-object(get("body")), "body");
    
  6. Click the + icon and add any other data transformation functions to your pipeline. See the Function Reference for a full list of available functions.
  7. Select the Write to Splunk Enterprise function.
  8. Select a Connection and an Index from the drop-down lists:
     Field      | Example
     index      | literal("main");
     parameters | Optional. For a list of available parameters, see Write to Splunk Enterprise function in the Function Reference.
  9. Click Validate to confirm your pipeline's functions are correctly configured.
  10. Click Save to save your pipeline, or Activate to activate it.
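
To check that the deserialization in step 5 behaves as expected, you can send a JSON-formatted test event to your Event Hub. The following sketch uses the Python azure-eventhub package (v5 SDK); the connection string, Event Hub name, and payload are hypothetical placeholders.

     import json
     from azure.eventhub import EventHubProducerClient, EventData

     # Placeholder connection string and Event Hub name; substitute your own.
     producer = EventHubProducerClient.from_connection_string(
         conn_str="<your-event-hubs-connection-string>",
         eventhub_name="operational-logs",
     )

     # Event Hubs delivers the payload to DSP as bytes, so a JSON string sent
     # here is what deserialize-json-object turns back into a map.
     batch = producer.create_batch()
     batch.add(EventData(json.dumps({"level": "info", "message": "test event"})))
     producer.send_batch(batch)
     producer.close()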
Last modified on 15 January, 2020

This documentation applies to the following versions of Splunk® Data Stream Processor: 1.0.0

