Splunk® Data Stream Processor

Function Reference

On April 3, 2023, Splunk Data Stream Processor reached its end of sale, and it will reach its end of life on February 28, 2025. If you are an existing DSP customer, please reach out to your account team for more information.

All DSP releases prior to DSP 1.4.0 use Gravity, a Kubernetes orchestrator, which has reached its end of life. We have replaced Gravity with an alternative component in DSP 1.4.0. Therefore, after July 1, 2023, we will no longer provide support for versions of DSP prior to DSP 1.4.0. We advise all of our customers to upgrade to DSP 1.4.0 to continue to receive full product support from Splunk.

Get data from Kafka

Use the Kafka source function to get data from an Apache or Confluent Kafka topic.

The payload of the ingested data is stored in a bytes field named value. During data previews, Splunk Data Stream Processor (DSP) displays the contents of bytes fields as Base64-encoded values. To deserialize the data so that you can view it as human-readable strings during data preview, see Deserialize and preview data from Kafka in DSP in the Connect to Data Sources and Destinations with DSP manual.
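
Because Base64 is only a display encoding, you can verify what a previewed value contains by decoding it yourself. The following sketch is an illustration only, not part of DSP: it uses the standard java.util.Base64 class to decode the value shown in the record example later in this topic, assuming the producer wrote UTF-8 text.

import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class DecodePreview {
    public static void main(String[] args) {
        // "aGVsbG8gd29ybGQ=" is the Base64 form that DSP shows during data preview.
        byte[] raw = Base64.getDecoder().decode("aGVsbG8gd29ybGQ=");
        // Assuming the producer wrote UTF-8 text, this prints "hello world".
        System.out.println(new String(raw, StandardCharsets.UTF_8));
    }
}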

Prerequisites

Before you can use this function, you must create a connection. See the Connect to Data Sources and Destinations with DSP manual for instructions on creating a connection for your Kafka deployment.

When configuring this source function, set the connection_id argument to the ID of the connection that you created.

Function output schema

This function outputs records with the schema described in the following table.

Key        Description
key        The key of the record in bytes.
value      The payload of the record in bytes.
topic      The name of the Kafka topic where the record is stored, given as a string.
partition  The number of the partition in the Kafka topic where the record is stored, given as an integer.
offset     The offset of the record, given as a long.

The following is an example of a typical record from the kafka function:

{
    "key": "YTE=",
    "value": "aGVsbG8gd29ybGQ=",
    "topic": "my-kafka-topic",
    "partition": 1,
    "offset": 248
}
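
This schema mirrors the fields that the Kafka consumer API itself exposes on each record. As an illustration only (this sketch reads from Kafka directly with the Apache kafka-clients library and is not part of DSP; the broker address, group ID, and topic name are placeholders), a plain Java consumer surfaces the same five fields:

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class SchemaMapping {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder broker address
        props.put("group.id", "example-group");           // placeholder consumer group
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.ByteArrayDeserializer");
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.ByteArrayDeserializer");

        try (KafkaConsumer<byte[], byte[]> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("my-kafka-topic"));
            ConsumerRecords<byte[], byte[]> records = consumer.poll(Duration.ofSeconds(1));
            for (ConsumerRecord<byte[], byte[]> record : records) {
                byte[] key = record.key();          // -> the "key" field (bytes); may be null
                byte[] value = record.value();      // -> the "value" field (bytes)
                String topic = record.topic();      // -> the "topic" field (string)
                int partition = record.partition(); // -> the "partition" field (integer)
                long offset = record.offset();      // -> the "offset" field (long)
                System.out.printf("%s[%d]@%d: %d-byte payload, key %s%n",
                        topic, partition, offset, value.length,
                        key == null ? "absent" : key.length + " bytes");
            }
        }
    }
}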

Required arguments

connection_id
Syntax: string
Description: The ID of your Kafka connection.
Example in Canvas View: my-kafka-connection
topic
Syntax: string
Description: The name of the Kafka topic.
Example in Canvas View: my-kafka-topic

Optional arguments

consumer_properties
Syntax: "<name>": "<value>"
Description: The Kafka consumer properties to apply when reading data from the topic. Defaults to empty.
  • When working in Canvas View, specify the name and value of the property in the fields on either side of the equal sign ( = ), and click Add to specify additional properties.
  • When working in SPL View, specify each property using the format "<name>": "<value>", and separate each property with a comma ( , ). Make sure to enclose the entire argument in braces ( { } ).
For a list of valid consumer properties, see the "Consumer Configs" section in the Apache Kafka documentation.
Example in Canvas View: name = value
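
For instance, two properties from the Kafka "Consumer Configs" list are max.poll.records and isolation.level. Assuming the placeholder connection and topic names used elsewhere in this topic, the argument might look like this in SPL View:

| from kafka("my-connection-id", "my-topic", {"max.poll.records": "500", "isolation.level": "read_committed"}) |...;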

SPL2 example

When working in SPL View, you can write the function by providing the arguments in this exact order.

| from kafka("my-connection-id", "my-topic", {"property1": "value1", "property2": "value2"}) |...;

Alternatively, you can use named arguments to declare the arguments in any order. The following example uses named arguments to list the optional argument before the required arguments.

| from kafka(consumer_properties: {"property1": "value1", "property2": "value2"}, topic: "my-topic", connection_id: "my-connection-id") |...;

If you want to use a mix of unnamed and named arguments in your functions, you must list all unnamed arguments in the correct order before providing the named arguments.
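
For example, the following pipeline passes connection_id and topic as unnamed arguments in their required order, and then passes the optional consumer_properties argument by name:

| from kafka("my-connection-id", "my-topic", consumer_properties: {"property1": "value1", "property2": "value2"}) |...;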



