Sum connector
The Splunk Distribution of the OpenTelemetry Collector uses the Sum connector to sum attribute values from spans, span events, metrics, data points, and log records.
As a receiver, the supported pipeline types are metrics, traces, and logs. As an exporter, the supported pipeline type is metrics. See Process your data with pipelines for more information.
Note
Values found within an attribute are converted into a float regardless of their original type before being summed and output as a metric value. Non-convertible strings are dropped and not included.
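As an illustration only, with hypothetical values that are not part of this documentation: if the source attribute appears on three data points as the integer 2, the string "3.5", and the string "pending", the connector sums the convertible values and drops the rest.

# Hypothetical source attribute values across three data points:
#   2          -> converted to 2.0
#   "3.5"      -> converted to 3.5
#   "pending"  -> not convertible, dropped
# Output metric value: 5.5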
Get started
Follow these steps to configure and activate the component:
Deploy the Splunk Distribution of the OpenTelemetry Collector to your host or container platform.
Configure the connector as described in the next section.
Restart the Collector.
Sample configuration

To activate the connector, add sum to the connectors section of your configuration file. For example:
connectors:
  sum:
To complete the configuration, include the connector in the service section of your configuration file according to the pipelines you want to use. For example:
service:
  pipelines:
    metrics/sum:
      receivers: [sum]
    traces:
      exporters: [sum]
Configuration options
The following settings are required:

Telemetry type. Nested below the sum connector declaration. Can be any of spans or spanevents for traces, datapoints for metrics, or logs. In Configuration example: Sum attribute values, it's declared as spans.

Metric name. Nested below the telemetry type. This is the metric name that the Sum connector will output summed values to. In Configuration example: Sum attribute values, it's declared as my.example.metric.name.

source_attribute. A specific attribute to search for within the source telemetry being fed to the connector. This attribute is where the connector looks for numerical values to sum into the output metric value. In Configuration example: Sum attribute values, it's declared as attribute.with.numerical.value.
The following settings are optional:

conditions. You can use OTTL syntax to provide conditions for processing incoming telemetry. Conditions are ORed together, so if any condition is met, the attribute's value is included in the resulting sum. For more information, see OTTL grammar in GitHub.

attributes. Declaration of attributes to include. Each unique combination of values found for these attributes generates a separate sum, output as its own datapoint in the metric time series.

key. Required for attributes. The attribute name to match against.

default_value. Optional for attributes. A default value for the attribute when no matches are found. The default_value can be a string, integer, or float.
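As a structural sketch only, the following shows how these options nest under the connector declaration. The metric name, attribute names, and condition below are placeholders chosen for illustration, not values required by the connector:

connectors:
  sum:
    spans:                                    # telemetry type: spans, spanevents, datapoints, or logs
      my.placeholder.metric:                  # metric name that receives the summed values
        source_attribute: my.numeric.attribute
        conditions:
          - attributes["my.numeric.attribute"] != "NULL"
        attributes:
          - key: my.grouping.attribute        # one datapoint per unique value of this attribute
            default_value: unknown            # used when the attribute is missing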
Configuration example: Sum attribute values

This example configuration sums numerical values found within the attribute attribute.with.numerical.value of any span telemetry routed to the connector and outputs a metric time series named my.example.metric.name with those summed values.
receivers:
  foo:
connectors:
  sum:
    spans:
      my.example.metric.name:
        source_attribute: attribute.with.numerical.value
exporters:
  bar:
service:
  pipelines:
    metrics/sum:
      receivers: [sum]
      exporters: [bar]
    traces:
      receivers: [foo]
      exporters: [sum]
Configuration example: Check payment logs

In this example, the Sum connector ingests logs and creates an output metric named checkout.total with numerical values found in the source_attribute total.payment. It also checks any incoming log telemetry for values present in the attribute payment.processor and creates a datapoint within the metric time series for each unique value.
It also makes sure that:

The attribute total.payment is not NULL.

Any logs without values in payment.processor are included in a datapoint with the default_value of unspecified_processor.
receivers:
  foo:
connectors:
  sum:
    logs:
      checkout.total:
        source_attribute: total.payment
        conditions:
          - attributes["total.payment"] != "NULL"
        attributes:
          - key: payment.processor
            default_value: unspecified_processor
exporters:
  bar:
service:
  pipelines:
    metrics/sum:
      receivers: [sum]
      exporters: [bar]
    logs:
      receivers: [foo]
      exporters: [sum]
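For illustration only, with hypothetical log values that are not part of this example: two logs carrying total.payment values of 20 and 30 with payment.processor set to examplepay, plus one log with a total.payment of 5 and no payment.processor attribute, would produce two datapoints on the checkout.total metric.

# Hypothetical input log attributes:
#   { total.payment: "20", payment.processor: "examplepay" }
#   { total.payment: "30", payment.processor: "examplepay" }
#   { total.payment: "5" }    <- no payment.processor attribute
#
# Resulting checkout.total datapoints:
#   value: 50.0   payment.processor: examplepay
#   value: 5.0    payment.processor: unspecified_processor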
Logs to metrics

For logs-to-metrics connections, if your logs contain all values in their body rather than in attributes, use a Transform processor in your pipeline to upsert parsed key/value pairs into attributes attached to the log.
For example, for a JSON payload:
processors:
  transform/logs:
    log_statements:
      - context: log
        statements:
          - merge_maps(attributes, ParseJSON(body), "upsert")
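Because the connector consumes the output of the logs pipeline, place the Transform processor ahead of the connector in that pipeline. A minimal sketch of the wiring, reusing the hypothetical foo receiver and bar exporter from the earlier examples:

service:
  pipelines:
    logs:
      receivers: [foo]
      processors: [transform/logs]
      exporters: [sum]
    metrics/sum:
      receivers: [sum]
      exporters: [bar]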
Troubleshooting
If you are a Splunk Observability Cloud customer and are not able to see your data in Splunk Observability Cloud, you can get help in the following ways.
Available to Splunk Observability Cloud customers
Submit a case in the Splunk Support Portal.
Contact Splunk Support.
Available to prospective customers and free trial users
Ask a question and get answers through community support at Splunk Answers.
Join the Splunk #observability user group Slack channel to communicate with customers, partners, and Splunk employees worldwide. To join, see Chat groups in the Get Started with Splunk Community manual.