New features for DSP
Here's what's new in each version of the Splunk Data Stream Processor.
The following table describes new features or enhancements in DSP 1.1.0.
|New Feature or Enhancement||Description|
|SPL2 Support||DSP now supports creating and configuring DSP pipelines using SPL2 (Search Processing Language version 2). For more information, see SPL2 for DSP.|
|SPL2 Builder||DSP now supports an additional pipeline builder experience allowing you to write pipelines in SPL2. For more information, see SPL2 Pipeline Builder.|
|DSP HTTP Event Collector||You can send events and metrics to a DSP data pipeline using the DSP HTTP Event Collector (DSP HEC). The DSP HEC supports the Splunk HTTP Event Collector (HEC) event format.|
|Syslog support||You can now easily ingest syslog data into DSP using Splunk Connect for Syslog (SC4S). For more information, see Send Syslog events to a DSP data pipeline using SC4S with DSP HEC.|
|Amazon Linux 2 support||DSP now supports Amazon Linux 2. For more information, see Hardware and Software requirements.|
|Upgraded Streams REST API endpoints to v3beta1||See the Splunk Data Stream Processor REST API Reference.|
|Apache Pulsar messaging bus||DSP now uses Apache Pulsar as its messaging bus for data sent via the Ingest, Collect, and Forwarders Services.|
|Splunk Enterprise sink function with Batching||You can now do index-based routing even while batching records. This function performs the common workflow of mapping the DSP event schema to Splunk HEC metrics or events schema, turning records into JSON payloads, and batching the bytes of those payloads for better throughput. See Write to the Splunk platform with Batching.|
|Splunk Enterprise sink function||This function replaces the Write Splunk Enterprise sink function, which was removed in this release.|
|Batch Bytes streaming function||DSP now supports batching your data as byte payloads for increased throughput. For more information, see Batch Bytes.|
|To Splunk JSON streaming function||You can now perform automatic mapping of DSP events schema to Splunk HEC events or metrics schema. For more information, see To Splunk JSON.|
|Write to S3-compatible storage sink function||DSP now supports sending data to an Amazon S3 bucket. For more information, see Write to S3-compatible storage.|
|Write to SignalFx sink function||DSP now supports sending data to a SignalFx Endpoint. For more information, see Write to SignalFx.|
|Microsoft 365 Connector||DSP now supports collecting data from Microsoft 365 and Office 365 services using the Microsoft 365 Connector. For more information, see Use the Microsoft 365 Connector with Splunk DSP.|
|Google Cloud Monitoring Metrics Connector||DSP now supports collecting metrics data from Google Cloud Monitoring. For more information, see Use the Google Cloud Monitoring Metrics Connector with Splunk DSP.|
|Amazon S3 Connector||The Amazon S3 Connector now supports Parquet format as a File Type. For more information, see Use the Amazon S3 Connector with Splunk DSP.|
|Write to Azure Event Hubs Using SAS Key sink function (Beta)||DSP now supports sending data to an Azure Event Hubs namespace using an SAS key. This function is in beta and is not intended for production use. For more information, see Write to Azure Event Hubs.|
|DSP Plugins SDK||You can now create your own custom functions using the DSP Plugins SDK. See Create custom functions with the DSP SDK.|
|Bug fixes||The Splunk Data Stream Processor 1.1.0 includes several bug fixes. For details, see Fixed Issues for DSP.|
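The Batch Bytes and Write to the Splunk platform with Batching functions improve throughput by grouping serialized records into size-bounded byte payloads before sending them downstream. The following Python sketch illustrates the batching idea only; `batch_payloads` and its size threshold are illustrative and do not reflect DSP's actual implementation:

```python
import json

def batch_payloads(records, max_batch_bytes=1024):
    """Group JSON-serialized records into byte batches, flushing a batch
    before it would exceed max_batch_bytes. A single record larger than
    the threshold still forms its own batch."""
    batches, current, current_size = [], [], 0
    for record in records:
        payload = json.dumps(record).encode("utf-8")
        if current and current_size + len(payload) > max_batch_bytes:
            batches.append(b"".join(current))
            current, current_size = [], 0
        current.append(payload)
        current_size += len(payload)
    if current:
        batches.append(b"".join(current))
    return batches

# Example: five small records grouped into a few larger byte payloads.
batches = batch_payloads([{"event": f"record {i}"} for i in range(5)],
                         max_batch_bytes=50)
```

Sending fewer, larger payloads reduces per-request overhead at the cost of a small amount of latency, which is the trade-off batching functions make.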
The Streams v3beta1 REST API adds the following new endpoints:
The Ingest v1beta2 REST API adds the following new endpoint:
The SCloud version that ships with the DSP installer does not include the new Streams v3beta1 API or the new Ingest v1beta2 endpoints. To use either of these services, upgrade SCloud. See Authenticate with SCloud.
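As an illustration of the HEC event format that the DSP HEC accepts, the following Python sketch builds a HEC-style JSON payload. The endpoint URL, port, and token are placeholders for your own deployment (not real DSP values), and the network send is shown only as a comment:

```python
import json

# Placeholder values -- replace with your DSP deployment's endpoint and token.
DSP_HEC_URL = "https://dsp.example.com:31000/services/collector"
DSP_HEC_TOKEN = "your-dsp-hec-token"

def make_hec_event(event, source=None, sourcetype=None):
    """Build a Splunk HEC-style JSON event payload as a string."""
    payload = {"event": event}
    if source:
        payload["source"] = source
    if sourcetype:
        payload["sourcetype"] = sourcetype
    return json.dumps(payload)

# Sending requires a reachable DSP deployment and the `requests` package:
# import requests
# requests.post(DSP_HEC_URL, data=make_hec_event("hello", sourcetype="syslog"),
#               headers={"Authorization": f"Splunk {DSP_HEC_TOKEN}"})
```

The `Authorization: Splunk <token>` header shown in the comment follows the standard Splunk HEC authentication scheme.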
Renamed functions in version 1.1.0
The following functions were renamed in 1.1.0.
|Original function name||Updated function name|
|Batch Events||Batch Records|
Removed features in version 1.1.0
The following features were removed in version 1.1.0.
|Removed in this version||What do I need to know?|
|literal and list scalar functions||The literal and list scalar functions are no longer available.|
|Streams DSL support in the UI||The Data Stream Processor UI no longer supports Streams DSL. Saving a DSL pipeline in the UI automatically triggers a migration from DSL to SPL2.|
|Write Splunk Enterprise sink function||The Write Splunk Enterprise sink function has been replaced by the Splunk Enterprise sink function.|
|The connectionless Read from Apache Kafka with SSL and Write to Kafka with SSL source and sink functions||Use the connection-based Read from Apache Kafka and Write to Kafka functions instead.|
- The /streams/v3beta1/pipelines/expand endpoint has been removed.
- The /streams/v3beta1/groups endpoint has been removed.
- The POST /streams/v3beta1/pipelines/merge endpoint has been removed.
- Bug fixes. For details, see Fixed issues.
This is the first release of the Splunk Data Stream Processor.
This documentation applies to the following versions of Splunk® Data Stream Processor: 1.1.0