Splunk® Data Stream Processor

Connect to Data Sources and Destinations with DSP

On April 3, 2023, Splunk Data Stream Processor reached its end of sale, and will reach its end of life on February 28, 2025. If you are an existing DSP customer, please reach out to your account team for more information.

All DSP releases prior to DSP 1.4.0 use Gravity, a Kubernetes orchestrator that has been announced as end-of-life. In DSP 1.4.0, Gravity has been replaced with an alternative component. Therefore, after July 1, 2023, Splunk will no longer provide support for versions of DSP prior to DSP 1.4.0. We advise all customers to upgrade to DSP 1.4.0 to continue receiving full product support from Splunk.

Create a DSP connection to Microsoft Azure Event Hubs

To get data from Microsoft Azure Event Hubs into a data pipeline in Splunk Data Stream Processor, you must first create a connection. You can then use the connection in the Microsoft Azure Event Hubs source function to get data from an event hub into a DSP pipeline. If you have a Universal license, you can also create a connection for the Send to Microsoft Azure Event Hubs sink function to send data from DSP to an event hub. See Licensing for the Splunk Data Stream Processor in the Install and administer the Data Stream Processor manual.

Send to Microsoft Azure Event Hubs is a beta function and is not ready for production use.

Prerequisites

Before you can create an Azure Event Hubs connection, you must have the following:

  • The shared access signature (SAS) key and name for accessing your Azure Event Hubs resources. If you don't have these credentials, ask your Azure Event Hubs administrator for assistance.
  • Outbound ports 5671 and 5672 open in the firewalls of all the nodes in the DSP cluster. If these outbound ports are not open, DSP cannot communicate with Azure Event Hubs. Ask your DSP administrator to confirm the port configurations and open the ports if needed.
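One way to confirm the port prerequisite is to run a quick outbound TCP check from each DSP node. The following is a minimal sketch using only the Python standard library; the namespace host name shown in the comment is a placeholder for your own Event Hubs namespace:

```python
import socket

def port_reachable(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example usage against your namespace host (placeholder shown):
#   for port in (5671, 5672):
#       print(port, port_reachable("my-namespace.servicebus.windows.net", port))
```

If either port reports unreachable, ask your DSP administrator to open it in the firewall before creating the connection.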

Steps

  1. In DSP, select the Connections page.
  2. On the Connections page, click Create Connection.
  3. Depending on whether you're using Azure Event Hubs as a data source or data destination, do one of the following:
    • On the Source tab, select Connector for Microsoft Azure Event Hubs Source and then click Next.
    • On the Sink tab, select Connector for Microsoft Azure Event Hubs Sink and then click Next.
  4. Complete the following fields:
    • Name: A unique name for your Azure Event Hubs connection.
    • Description: (Optional) A description of your Azure Event Hubs connection.
    • Namespace name: The name of the Azure Event Hubs namespace used to read and write events.
    • SAS name: The name of the shared access signature (SAS) key used to authenticate to Azure Event Hubs.
    • SAS key: The SAS key used to authenticate to Azure Event Hubs. Your shared access key must have listening privileges for the namespace.

    Any credentials that you upload are transmitted securely by HTTPS, encrypted, and securely stored in a secrets manager.

  5. Click Save.

    If you're editing a connection that's being used by an active pipeline, you must reactivate that pipeline after making your changes. When you reactivate a pipeline, you must select where you want to resume data ingestion. See Using activation checkpoints to activate your pipeline in the Use the Data Stream Processor manual for more information.
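The Namespace name, SAS name, and SAS key fields in step 4 correspond to the parts of a standard Azure Event Hubs connection string. If you want to sanity-check the credentials outside of DSP (for example, with the Azure CLI or an AMQP client) before saving the connection, you can assemble them as follows. This is an illustrative sketch; the helper name and the placeholder values are not part of DSP:

```python
def event_hubs_connection_string(namespace: str, sas_name: str, sas_key: str) -> str:
    """Assemble a standard Azure Event Hubs connection string from the same
    values that the DSP connection form asks for."""
    return (
        f"Endpoint=sb://{namespace}.servicebus.windows.net/;"
        f"SharedAccessKeyName={sas_name};"
        f"SharedAccessKey={sas_key}"
    )

# Example with placeholder values:
#   event_hubs_connection_string("my-namespace", "listen-policy", "<sas-key>")
```

If the assembled string fails to authenticate with standard Azure tooling, the same values will also fail in DSP, so this can narrow down credential problems before you activate a pipeline.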

You can now use your connection in a Microsoft Azure Event Hubs source function at the start of your data pipeline to get data from Azure Event Hubs, or in a Send to Microsoft Azure Event Hubs sink function at the end of your pipeline to send data to Azure Event Hubs.

Last modified on 25 March, 2022

This documentation applies to the following versions of Splunk® Data Stream Processor: 1.3.0, 1.3.1, 1.4.0, 1.4.1, 1.4.2, 1.4.3, 1.4.4, 1.4.5, 1.4.6

