Deserialize and send Azure Event Hubs data from a DSP pipeline
Your pipeline can ingest data from Azure Event Hubs and send that data to a Splunk index or a third-party sink.
Prerequisites
- A connection to read from your Azure Event Hubs namespace. See the Azure Event Hubs connector in the Getting Data In manual.
- A dedicated Event Hub consumer group for each pipeline. See Create or update consumer groups.
This connector uses a non-epoch receiver to retrieve data from Azure Event Hubs through a consumer group. An Event Hub consumer group can support up to five concurrent non-epoch receivers. If you use the same consumer group across multiple pipelines, or if other programs connect through that consumer group, your pipeline can fail to connect. In addition, if an epoch-based receiver connects through the same consumer group, connections from non-epoch receivers are rejected. To avoid these risks, the best practice is to create a dedicated consumer group for each pipeline, as shown in the sketch below. If a pipeline still fails to connect, see the connectivity check at the end of this topic.
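One way to create such a dedicated consumer group is with the azure-mgmt-eventhub Python SDK. The following is a minimal sketch, not part of this product's tooling: the subscription ID, resource group, and namespace values are placeholders, and the Event Hub and consumer group names simply reuse the examples from this topic.

```python
# Illustrative sketch: create a consumer group reserved for one DSP pipeline.
# All resource identifiers below are placeholders, not values from this topic.
from azure.identity import DefaultAzureCredential
from azure.mgmt.eventhub import EventHubManagementClient

client = EventHubManagementClient(
    credential=DefaultAzureCredential(),
    subscription_id="<subscription-id>",
)

# A dedicated consumer group keeps this pipeline's non-epoch receivers from
# competing with other readers of the same Event Hub.
client.consumer_groups.create_or_update(
    resource_group_name="<resource-group>",
    namespace_name="<namespace>",
    event_hub_name="operational-logs",
    consumer_group_name="dsp-ConsumerGroup",
    parameters={},
)
```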
Create a pipeline
After you satisfy these prerequisites, you can ingest data from Azure Event Hubs.
- From the Add Data page, select Read from Azure Event Hubs using SAS Key as your source function.
- On the next page, complete the following fields:
| Field | Description | Example |
| --- | --- | --- |
| Connection Id | The name of the connection set up as a prerequisite. | myEventHubConnection |
| Event Hub name | The name of the Event Hub entity to subscribe to. | operational-logs |
| Consumer Group Name | The name of the consumer group set up as a prerequisite. | dsp-ConsumerGroup |
| Starting Position | Optional. Specifies where to start reading data from. Defaults to earliest. | earliest |
- Click the + to add a new function.
- Select Eval.
- Azure Event Hubs transmits data in binary, which does not follow the DSP event schema, so you must convert your data from bytes. In the Eval function's textbox, call to-string to turn your bytes into a string, or deserialize-json-object to turn your bytes into a map:
as(to-string(get("body")), "body");
as(deserialize-json-object(get("body")), "body");
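To see the difference between the two conversions, here is a conceptual analogy in Python (not DSP code); the sample payload is made up:

```python
# Conceptual analogy of the two Eval expressions above; not DSP code.
# The payload below is a made-up example.
import json

body_bytes = b'{"source": "vm-42", "status": "healthy"}'

# to-string: the bytes become a single UTF-8 string field.
body_as_string = body_bytes.decode("utf-8")

# deserialize-json-object: the bytes become a map whose keys
# you can address individually in later functions.
body_as_map = json.loads(body_bytes)

print(body_as_string)         # {"source": "vm-42", "status": "healthy"}
print(body_as_map["status"])  # healthy
```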
- Click the + icon and add any other data transformation functions to your pipeline. See the function reference for a full list of available functions.
- Click the + icon to add a new function to your pipeline.
- Select the Write to Splunk Enterprise function.
- Select a Connection and an Index from the drop-down menus:
| Field | Example |
| --- | --- |
| index | literal("main"); |
| parameters | Optional. For a list of available parameters, see the Write to Splunk Enterprise function reference. |
- Click Validate to confirm your pipeline's functions are correctly configured.
- Click Save to save your pipeline, or Activate to activate it.
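If an activated pipeline fails to connect to Azure Event Hubs, you can verify the Event Hub name, consumer group, and SAS key outside of DSP. The following is a minimal sketch using the azure-eventhub Python SDK; the connection string is a placeholder, and the Event Hub and consumer group names reuse the examples from this topic.

```python
# Minimal connectivity check; all values below are placeholders.
from azure.eventhub import EventHubConsumerClient

client = EventHubConsumerClient.from_connection_string(
    conn_str="Endpoint=sb://<namespace>.servicebus.windows.net/;"
             "SharedAccessKeyName=<key-name>;SharedAccessKey=<key>",
    consumer_group="dsp-ConsumerGroup",
    eventhub_name="operational-logs",
)

with client:
    # If the SAS key, Event Hub name, and consumer group are valid,
    # this prints the Event Hub's partition IDs, for example ['0', '1'].
    print(client.get_partition_ids())
```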