Splunk® Data Stream Processor

Connect to Data Sources and Destinations with DSP


Create a DSP connection to Microsoft 365

To get data from the Office 365 Management Activity API into a data pipeline in Splunk Data Stream Processor (DSP), you must first create a connection using the Microsoft 365 connector. In the connection settings, provide the credentials of your Microsoft Azure Active Directory (AD) integration application so that DSP can access your data, and schedule a data collection job to specify how frequently DSP retrieves the data. You can then use the connection in the Microsoft 365 source function to get data from the Office 365 Management Activity API into a DSP pipeline.

Prerequisites

Before you can create the Microsoft 365 connection, you must have the following:

  • An integration application that registers the connector in Microsoft Azure Active Directory (AD) and has the Read activity data for your organization permission assigned to it.
  • The following credentials from the integration application:
    • Tenant ID, which is also known as a directory ID.
    • Client ID, which is also known as an application ID.
    • Client secret, which is also known as a key.

If you don't have this integration application or the credentials, ask your Microsoft 365 administrator for assistance. For information about creating integration applications, search for "Get started with Office 365 Management APIs" in the Office 365 Management APIs documentation.
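DSP uses these credentials to authenticate to Azure AD on your behalf. As a rough illustration of what that involves, the following sketch builds the OAuth2 client-credentials token request that the Office 365 Management APIs accept. This is not DSP code: the endpoint layout assumes the Azure AD v1.0 client-credentials flow, and all credential values are placeholders.

```python
# Sketch: the token request that a client makes with the tenant ID,
# client ID, and client secret described above. Placeholder values only.
from urllib.parse import urlencode

TOKEN_ENDPOINT = "https://login.microsoftonline.com/{tenant_id}/oauth2/token"
RESOURCE = "https://manage.office.com"  # Office 365 Management APIs resource

def build_token_request(tenant_id: str, client_id: str, client_secret: str):
    """Return the (url, form_body) pair for a client-credentials token request."""
    url = TOKEN_ENDPOINT.format(tenant_id=tenant_id)
    body = urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "resource": RESOURCE,
    })
    return url, body

# Example with placeholder credentials:
url, body = build_token_request("00000000-0000-0000-0000-000000000000",
                                "my-client-id", "my-client-secret")
# POST `body` to `url` with Content-Type application/x-www-form-urlencoded;
# the JSON response includes an "access_token" for calls to the API.
```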

Steps

  1. From the Data Stream Processor home page, click Data Management and then select the Connections tab.
  2. Click Create New Connection.
  3. Select Microsoft 365 and then click Next.
  4. Complete the following fields:
    • Connection Name: A unique name for your connection.
    • Tenant ID: The tenant ID from Azure AD.
    • Client ID: The client ID from your integration application in Azure AD.
    • Client Secret: The client secret from your integration application in Azure AD.
    • Content Types: The types of logs to collect from Microsoft 365 and Office 365 services. Select one or more of the following types:
      • Audit.AzureActiveDirectory: The audit logs for Azure AD.
      • Audit.Exchange: The audit logs for Microsoft Exchange.
      • Audit.SharePoint: The audit logs for Microsoft SharePoint.
      • Audit.General: The general audit logs for Microsoft 365.
      • DLP.All: The data loss prevention (DLP) event logs for all services.
    • Scheduled: On by default, which means that jobs run automatically. Toggle this parameter off to stop the scheduled job from running automatically. Jobs that are currently running aren't affected.
    • Schedule: The time-based job schedule that determines when the connector runs data collection jobs. Select a predefined value or write a custom CRON schedule. All CRON schedules are based on UTC.

      To avoid long-running jobs that don't collect any additional data, schedule your jobs to run at intervals of 24 hours or less. Each request from the connector to the API is limited to a maximum time period of 24 hours.

    • Workers: The number of workers to use for collecting data.

    Any credentials that you upload are transmitted securely by HTTPS, encrypted, and securely stored in a secrets manager.

  5. Click Save.

    If you're editing a connection that's being used by an active pipeline, you must reactivate that pipeline after making your changes.
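Behind the scenes, each content type you selected corresponds to a subscription in the Office 365 Management Activity API, which the connector manages for you. As a hedged sketch only (assuming the documented v1.0 URL layout of the Management Activity API), the URL for starting a subscription for one content type looks like this:

```python
# Sketch: Management Activity API subscription URLs for the content types
# listed above. Illustrative helper, not part of DSP.
BASE = "https://manage.office.com/api/v1.0/{tenant_id}/activity/feed"

CONTENT_TYPES = [
    "Audit.AzureActiveDirectory",
    "Audit.Exchange",
    "Audit.SharePoint",
    "Audit.General",
    "DLP.All",
]

def subscription_start_url(tenant_id: str, content_type: str) -> str:
    """Return the URL to POST to in order to start one content-type subscription."""
    if content_type not in CONTENT_TYPES:
        raise ValueError(f"unknown content type: {content_type}")
    return (BASE.format(tenant_id=tenant_id)
            + "/subscriptions/start?contentType=" + content_type)
```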

You can now use your connection in a Microsoft 365 source function at the start of your data pipeline to get data from the Office 365 Management Activity API. For instructions on how to build a data pipeline, see the Building a pipeline chapter in the Use the manual. For information about the source function, see Get data from Microsoft 365 in the Function Reference manual.
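The 24-hour limit on each API request means that any collection interval longer than a day must be covered by multiple requests. A minimal illustration (not DSP code) of splitting a time range into windows of at most 24 hours:

```python
# Sketch: split a collection interval into windows no longer than 24 hours,
# mirroring the per-request limit of the Management Activity API.
from datetime import datetime, timedelta

MAX_WINDOW = timedelta(hours=24)

def split_into_windows(start: datetime, end: datetime):
    """Yield (window_start, window_end) pairs, each spanning at most 24 hours."""
    cursor = start
    while cursor < end:
        window_end = min(cursor + MAX_WINDOW, end)
        yield cursor, window_end
        cursor = window_end

# A 60-hour range becomes two full 24-hour windows plus a 12-hour remainder:
windows = list(split_into_windows(datetime(2021, 3, 1), datetime(2021, 3, 3, 12)))
```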

Last modified on 05 March, 2021

This documentation applies to the following versions of Splunk® Data Stream Processor: 1.2.0

