Splunk® Data Stream Processor

Connect to Data Sources and Destinations with DSP

On April 3, 2023, Splunk Data Stream Processor reached its end of sale, and will reach its end of life on February 28, 2025. If you are an existing DSP customer, please reach out to your account team for more information.

All DSP releases prior to DSP 1.4.0 use Gravity, a Kubernetes orchestrator for which end of life has been announced. We have replaced Gravity with an alternative component in DSP 1.4.0. Therefore, after July 1, 2023, we will no longer provide support for versions of DSP prior to DSP 1.4.0. We advise all of our customers to upgrade to DSP 1.4.0 to continue to receive full product support from Splunk.
This documentation does not apply to the most recent version of Splunk® Data Stream Processor. For documentation on the most recent version, go to the latest release.

Connecting Microsoft 365 to your DSP pipeline

The Microsoft 365 connector is planned for deprecation. See the Release Notes for more information.

When creating a data pipeline in Splunk Data Stream Processor, you can connect to the Office 365 Management Activity API and use it as a data source. The API lets you audit the end-user activity in your Microsoft 365 and Office 365 services by providing logs that track the status, messages, and management activity in each service. You can get log data from the API into a pipeline, transform the data as needed, and then send the transformed data out from the pipeline to a destination of your choosing.
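
The DSP connector handles this collection for you, but if you want to preview what the API returns, the following is a minimal sketch, outside of DSP, of pulling audit content directly from the Office 365 Management Activity API using Python and the requests library. It assumes you already have an OAuth access token for https://manage.office.com (acquiring one is sketched later in this topic) and that a subscription for the Audit.General content type has already been started; the placeholder values are yours to fill in.

    # Sketch only: list and fetch Office 365 Management Activity audit content
    # directly, outside of DSP. Assumes the "requests" library, an existing
    # Audit.General subscription, and an OAuth access token for
    # https://manage.office.com (see the token sketch later in this topic).
    import requests

    TENANT_ID = "<your-tenant-id>"          # placeholder
    ACCESS_TOKEN = "<your-access-token>"    # placeholder
    headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

    base = f"https://manage.office.com/api/v1.0/{TENANT_ID}/activity/feed"

    # 1. List the content blobs currently available for a content type.
    listing = requests.get(
        f"{base}/subscriptions/content",
        params={"contentType": "Audit.General"},
        headers=headers,
    )
    listing.raise_for_status()

    # 2. Each blob's contentUri returns a JSON array of audit events. DSP
    #    receives each of these events as one record.
    for blob in listing.json():
        for event in requests.get(blob["contentUri"], headers=headers).json():
            print(event.get("CreationTime"), event.get("Operation"))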

To connect to the Office 365 Management Activity API as a data source, you must complete the following tasks:

  1. Create a connection that allows DSP to access your Microsoft 365 data. See Create a DSP connection to Microsoft 365.
  2. Create a pipeline that starts with the Microsoft 365 source function. See the Building a pipeline chapter in the Use the Data Stream Processor manual for instructions on how to build a data pipeline.
  3. Configure the Microsoft 365 source function to use your Microsoft 365 connection. See Get data from Microsoft 365 in the Function Reference manual.

When you activate the pipeline, the source function starts collecting logs from the Office 365 Management Activity API. Each log is received into the pipeline as a record.
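
The exact fields in each record vary by workload and record type. As a purely illustrative sketch, an event that follows the Management Activity API common schema might arrive as a record shaped like the following; all of the values here are invented for the example.

    # Illustrative only: the shape of a single audit event as it might arrive
    # as a DSP record. Field names follow the Management Activity API common
    # schema; all values are invented for this example.
    example_record = {
        "Id": "aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee",
        "RecordType": 8,                     # numeric record type (illustrative)
        "CreationTime": "2022-03-01T12:34:56",
        "Operation": "Add user.",
        "OrganizationId": "ffffffff-1111-2222-3333-444444444444",
        "Workload": "AzureActiveDirectory",
        "UserId": "admin@example.onmicrosoft.com",
    }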

If your data fails to get into DSP, check the connection settings to make sure you have the correct tenant ID, client ID, and client secret for your Microsoft Azure Active Directory (AD) integration application. DSP doesn't check whether the credentials you enter are valid.
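
One way to verify the credentials yourself is to request a token from Azure AD outside of DSP. The following is a minimal sketch, assuming the requests library; if the tenant ID, client ID, or client secret is wrong, the token request returns an error instead of an access token.

    # Sketch only: verify the Azure AD application credentials outside of DSP
    # by requesting a token with the client credentials grant. A wrong tenant
    # ID, client ID, or client secret causes this request to fail.
    import requests

    TENANT_ID = "<your-tenant-id>"          # placeholders: use your own values
    CLIENT_ID = "<your-client-id>"
    CLIENT_SECRET = "<your-client-secret>"

    resp = requests.post(
        f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/v2.0/token",
        data={
            "grant_type": "client_credentials",
            "client_id": CLIENT_ID,
            "client_secret": CLIENT_SECRET,
            "scope": "https://manage.office.com/.default",
        },
    )
    if resp.ok:
        print("Credentials look valid; received an access token.")
    else:
        print("Token request failed:", resp.status_code, resp.text)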

How Microsoft 365 data is collected

The source function collects data according to the job schedule that you specified in the connection settings. See Scheduled data collection jobs for more information, including a list of the limitations that apply to all scheduled data collection jobs.

The following behavior from the Microsoft 365 connector might have an impact on the exact timing and scope of your data collection jobs:

  • The first time a scheduled job runs, the connector collects data from the past 30 minutes. For all subsequent scheduled jobs, the connector collects data according to the schedule that you specified.
  • In some cases, the connector doesn't collect an event until up to 5 days after the event was originally generated. For more information about this delay, search for "Office 365 Management Activity API frequently asked questions" in the Office 365 Management APIs documentation.
  • The connector can send a maximum of 2,000 requests per minute.
  • Each request from the connector is limited to a maximum time period of 24 hours. The sketch after this list shows how a longer collection range breaks into 24-hour request windows.
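
As an illustration of the last two limits, the following is a minimal sketch, not DSP code, of how a client could split a longer collection range into request windows of at most 24 hours each. Aside from the two limits quoted from the list above, everything here is illustrative.

    # Illustrative only: break a long collection range into request windows of
    # at most 24 hours each, per the request limit noted in the list above.
    from datetime import datetime, timedelta

    MAX_WINDOW = timedelta(hours=24)    # maximum time period per request

    def request_windows(start: datetime, end: datetime):
        """Yield (window_start, window_end) pairs covering start..end."""
        cursor = start
        while cursor < end:
            window_end = min(cursor + MAX_WINDOW, end)
            yield cursor, window_end
            cursor = window_end

    # Example: a 5-day backfill becomes five 24-hour requests, far below the
    # 2,000-requests-per-minute ceiling noted in the list above.
    now = datetime.utcnow()
    for w_start, w_end in request_windows(now - timedelta(days=5), now):
        print(w_start.isoformat(), "->", w_end.isoformat())
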
Last modified on 29 March, 2022

This documentation applies to the following versions of Splunk® Data Stream Processor: 1.2.1, 1.2.2-patch02, 1.2.4, 1.2.5, 1.3.0, 1.3.1

