Splunk® Data Stream Processor

Function Reference



DSP 1.2.0 is impacted by the CVE-2021-44228 and CVE-2021-45046 security vulnerabilities from Apache Log4j. To fix these vulnerabilities, you must upgrade to DSP 1.2.4. See Upgrade the Splunk Data Stream Processor to 1.2.4 for upgrade instructions.

On October 30, 2022, all 1.2.x versions of the Splunk Data Stream Processor will reach their end of support date. See the Splunk Software Support Policy for details.

Get data from Microsoft Azure Event Hubs

Use the Microsoft Azure Event Hubs source function to get data from an Azure Event Hubs namespace.

The payload of the ingested data is encoded as bytes. To deserialize and preview your data, see Deserialize and preview data from Microsoft Azure Event Hubs in the Connect to Data Sources and Destinations with the Data Stream Processor manual.
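For example, if your events carry JSON payloads, you can convert the body field from bytes into a map of key-value pairs directly in the pipeline. The following is a minimal sketch that assumes JSON payloads and uses the deserialize_json_object scalar function from this Function Reference:

| from event_hubs("my-connection-id", "my-event-hub-name", "my-consumer-group", "LATEST") | eval body=deserialize_json_object(body) | ...;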

Prerequisites

Before you can use this function, you must create a connection. See Create a connection to Microsoft Azure Event Hubs in the Connect to Data Sources and Destinations with the Data Stream Processor manual. When configuring this source function, set the connection_id argument to the ID of that connection.

Function output schema

This function outputs records with the schema described in the following table.

Key              Description
partitionKey     The partition key of the event, as a string.
body             The payload of the event, in bytes.
partitionId      The ID of the partition in the event hub where the event is stored, as a string.
offset           The offset of the event, as a string.
sequenceNumber   The sequence number of the event, as a long.
enqueuedTime     The date and time when the event was enqueued for delivery to subscribers, as a long representing epoch time in milliseconds.
properties       The user-defined properties associated with the event, as a map of strings.

The following is an example of a typical record from the event_hubs function:

{
    "partitionKey": "1",
    "body": "aGVsbG8gd29ybGQ=",
    "partitionId": "1",
    "offset": "8589944464",
    "sequenceNumber": 83,
    "enqueuedTime": 1598479296172,
    "properties": {
        "MyProperty": "TestVal"
    }
}
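The body value shown here is the raw payload bytes rendered as a Base64 string; "aGVsbG8gd29ybGQ=" decodes to "hello world". The other fields can be referenced directly in downstream functions. As a minimal sketch, assuming the map_get scalar function from this Function Reference, the following pipeline copies a user-defined property out of the properties map into its own top-level field:

| from event_hubs("my-connection-id", "my-event-hub-name", "my-consumer-group", "LATEST") | eval my_property=map_get(properties, "MyProperty") | ...;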

Required arguments

connection_id
Syntax: string
Description: The ID of your Azure Event Hubs connection.
Example in Canvas View: my-azure-event-hubs-connection
event_hub_name
Syntax: string
Description: The name of the Event Hub entity to subscribe to.
Example in Canvas View: my-event-hub-name
consumer_group_name
Syntax: string
Description: The name of a consumer group. This name must match a consumer group defined in Azure Event Hubs; if the consumer group does not exist, the pipeline fails. Azure Event Hubs limits consumer groups to 5 concurrent readers. To avoid reaching this limit, create a new, dedicated consumer group for each pipeline.
Example in Canvas View: my-consumer-group
starting_position
Syntax: LATEST | EARLIEST
Description: The position in the data stream where you want to start reading data. Set this argument to one of the following values:
  • LATEST: Start reading data from the latest position on the data stream.
  • EARLIEST: Start reading data from the very beginning of the data stream.
Example in Canvas View: LATEST

SPL2 example

When working in the SPL View, you can write the function by providing the arguments in this exact order:

| from event_hubs("my-connection-id", "my-event-hub-name", "my-consumer-group", "LATEST") | ...;

Alternatively, you can use named arguments to declare the arguments in any order. The following example declares all four arguments by name:

| from event_hubs(starting_position: "LATEST", event_hub_name: "my-event-hub-name", connection_id: "my-connection-id", consumer_group_name: "my-consumer-group") | ...;

If you want to use a mix of unnamed and named arguments in your functions, you must list all unnamed arguments in the correct order before providing the named arguments.
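For example, the following sketch passes connection_id and event_hub_name as unnamed arguments in their expected order, then declares the remaining arguments by name:

| from event_hubs("my-connection-id", "my-event-hub-name", consumer_group_name: "my-consumer-group", starting_position: "EARLIEST") | ...;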

Last modified on 19 April, 2021

This documentation applies to the following versions of Splunk® Data Stream Processor: 1.2.0, 1.2.1-patch02, 1.2.1, 1.2.2-patch02, 1.2.4, 1.2.5, 1.3.0, 1.3.1, 1.4.0, 1.4.1, 1.4.2, 1.4.3

