Splunk® Data Stream Processor

Use the Data Stream Processor

Splunk Data Stream Processor reaches its end of sale on April 3, 2023, and its end of life on February 28, 2025. If you are an existing DSP customer, reach out to your account team for more information.
This documentation does not apply to the most recent version of Splunk® Data Stream Processor. For documentation on the most recent version, go to the latest release.

Deserialize and send Azure Event Hubs data from a DSP pipeline

Your pipeline can ingest data from Azure Event Hubs and send that data to a Splunk index or a third-party sink.

Prerequisites

This connector uses a non-epoch receiver to retrieve data from Azure Event Hubs through a consumer group. An Event Hub consumer group can support up to five concurrent non-epoch receivers. If you use the same consumer group across multiple pipelines, or if other programs connect through that consumer group, your pipeline can fail to connect. In addition, if an epoch-based receiver connects through the same consumer group, connections from non-epoch receivers are rejected. To avoid exceeding the receiver limit or colliding with epoch-based receivers, the best practice is to create a dedicated consumer group for each pipeline, as shown in the sketch below.
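
If you manage your Event Hubs with the Azure CLI, you can create a dedicated consumer group with the az eventhubs eventhub consumer-group create command. This is a minimal sketch; the resource group and namespace names are placeholders that you must replace with your own values:

    # Create a consumer group reserved for one DSP pipeline.
    # my-resource-group and my-eventhubs-namespace are placeholder names.
    az eventhubs eventhub consumer-group create \
        --resource-group my-resource-group \
        --namespace-name my-eventhubs-namespace \
        --eventhub-name operational-logs \
        --name dsp-ConsumerGroup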

Create a pipeline

After you satisfy the prerequisites, you can ingest data from Azure Event Hubs.

  1. From the Add Data page, select Read from Azure Event Hubs using SAS Key as your source function.
  2. On the next page, complete the following fields:
    Field               | Description                                                                  | Example
    Connection Id       | The name of the connection set up as a prerequisite.                         | myEventHubConnection
    Event Hub name      | The name of the Event Hub entity to subscribe to.                            | operational-logs
    Consumer Group Name | The name of the consumer group set up as a prerequisite.                     | dsp-ConsumerGroup
    Starting Position   | Optional. Specifies where to start reading data from. Defaults to earliest.  | earliest
  3. Click the + to add a new function.
  4. Select Eval.
  5. Azure Event Hubs transmits data in binary format, which does not follow the DSP event schema, so you must deserialize the data from bytes. In the Eval function's text box, call deserialize_json_object to convert the bytes in the body field into a map. An extended sketch of this expression follows these steps.
    body = deserialize_json_object(body)
    
  6. Click the + icon and add any other data transformation functions to your pipeline. See the function reference for a full list of available functions.
  7. Click the + icon to add a new function to your pipeline.
  8. Select the Write to the Splunk platform with Batching function.
  9. Configure the Write to the Splunk platform with Batching function. See Write to the Splunk platform with batching for more detailed information on these fields. A sketch of a per-event index expression follows these steps.
    Field         | Example
    index         | null
    default_index | "main"
    parameters    | Optional. For a list of available parameters, see the Write to the Splunk platform with Batching function reference.
  10. Click Validate to confirm your pipeline's functions are correctly configured.
  11. Click Save to save your pipeline, or Activate to activate it.
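
The following sketch extends the Eval expression from step 5. The host key is an assumption about your JSON payload, not a guarantee, and you should confirm in the function reference that map_get and ucast are available in your DSP version:

    body = deserialize_json_object(body),
    host = ucast(map_get(body, "host"), "string", null)

Events whose deserialized body lacks the assumed host key get a null host, so the expression stays safe for mixed payloads.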
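For step 9, the index field accepts an expression that is evaluated per event. This sketch assumes your events carry a target index in their attributes map under the key "index"; events without it resolve to null and fall back to default_index:

    ucast(map_get(attributes, "index"), "string", null)
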
Last modified on 31 August, 2020

This documentation applies to the following versions of Splunk® Data Stream Processor: 1.1.0

