
Jaeger gRPC 🔗

Description 🔗

The Splunk Distribution of OpenTelemetry Collector provides this integration as the jaeger-grpc monitor by using the SignalFx Smart Agent Receiver.

Use this monitor to run a gRPC server that listens for Jaeger trace batches and forwards them to Splunk Observability Cloud (or the configured ingest host in the writer section of the agent config).

By default, the server listens on port 14250, but you can configure it to listen on a different host and port.

Note: This is a valid Smart Agent monitor, but Smart Agent is deprecated. For details, see the Deprecation Notice. If you are using OpenTelemetry, consider using the native OpenTelemetry Jaeger receiver instead. To learn more, see the Jaeger receiver documentation in GitHub.
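The following is a minimal sketch of the native OpenTelemetry Jaeger receiver configured to accept gRPC trace batches on the same port. The endpoint value is an assumption; adjust it to match the address your Jaeger clients send to, and add the receiver to a traces pipeline alongside your existing exporters.

receivers:
  jaeger:
    protocols:
      grpc:
        endpoint: 0.0.0.0:14250   # Assumed listen address; adjust to your environment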

Benefits 🔗

After you configure the integration, you can access these features:

  • View metrics using the built-in dashboard. For information about dashboards, see View dashboards in Observability Cloud.

  • View a data-driven visualization of the physical servers, virtual machines, AWS instances, and other resources in your environment that are visible to Infrastructure Monitoring. For information about navigators, see Splunk Infrastructure Monitoring navigators.

  • Access Metric Finder and search for metrics sent by the monitor. For information about Metric Finder, see Use the Metric Finder.

Installation 🔗

Follow these steps to deploy this integration:

  1. Follow the steps to deploy the Splunk Distribution of OpenTelemetry Collector to your host or container platform.

  2. Configure the monitor, as described in the Configuration section.

  3. Restart the Splunk Distribution of OpenTelemetry Collector.

Configuration 🔗

This monitor is available in the Smart Agent Receiver, which is part of the Splunk Distribution of OpenTelemetry Collector. You can use existing Smart Agent monitors as OpenTelemetry Collector metric receivers with the Smart Agent Receiver.

This monitor requires a properly configured environment on your system, in which you’ve installed a functional Smart Agent release bundle. The Splunk Distribution of OpenTelemetry Collector provides this bundle in the installation paths for x86_64/amd64.

To activate this monitor in the Splunk Distribution of OpenTelemetry Collector, add the following lines to your configuration YAML file. For sample configuration files, see https://github.com/signalfx/splunk-otel-collector/tree/main/cmd/otelcol/config/collector on GitHub.

Configuration example 🔗

To activate this monitor in the Splunk Distribution of OpenTelemetry Collector, add the following to your agent configuration:

receivers:
  smartagent/jaeger-grpc: 
    type: jaeger-grpc
    ... # Additional config

To complete the integration, include this monitor in a traces pipeline. To do this, add the monitor to the service/pipelines/traces/receivers section of your configuration file.

service:
  pipelines:
    traces:
      receivers: [smartagent/jaeger-grpc]
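As a fuller sketch, the following configuration pairs the receiver with an exporter in a traces pipeline. The otlp exporter, its endpoint, and the TLS setting are assumptions used only for illustration; use the exporter and settings that your deployment already defines.

receivers:
  smartagent/jaeger-grpc:
    type: jaeger-grpc
    listenAddress: 0.0.0.0:14250   # Assumed; matches the documented default

exporters:
  otlp:                            # Assumed exporter for illustration
    endpoint: otel-gateway:4317    # Hypothetical gateway address
    tls:
      insecure: true               # Assumption; remove if your gateway uses TLS

service:
  pipelines:
    traces:
      receivers: [smartagent/jaeger-grpc]
      exporters: [otlp]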

See the configuration example on GitHub for specific use cases that show how the Splunk OpenTelemetry Collector can integrate and complement existing environments.

For Prometheus, see the Prometheus Federation Endpoint Example in GitHub for an example of how the Collector works with Splunk Enterprise and an existing Prometheus deployment.

Configuration settings 🔗

The following table shows the configuration options for this monitor:

Option          Required   Type                 Description
listenAddress   no         string               The host:port on which to listen for traces. The default value is 0.0.0.0:14250.
tls             no         object (see below)   Optional TLS credential settings used to configure the gRPC server.

The nested tls config object has the following fields:

Option     Required   Type     Description
certFile   no         string   The certificate file to use for TLS.
keyFile    no         string   The key file to use for TLS.
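
For example, the following sketch shows the monitor configured with an explicit listen address and TLS credentials. The file paths are hypothetical; substitute the certificate and key files from your environment.

receivers:
  smartagent/jaeger-grpc:
    type: jaeger-grpc
    listenAddress: 0.0.0.0:14250              # Assumed host:port; the documented default
    tls:
      certFile: /etc/otel/certs/server.crt    # Hypothetical certificate path
      keyFile: /etc/otel/certs/server.key     # Hypothetical key path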

Metrics 🔗

The Splunk Distribution of OpenTelemetry Collector does not do any built-in filtering of metrics for this monitor.

Troubleshooting 🔗

If you are not able to see your data in Splunk Observability Cloud, try these tips:

To learn about even more support options, see Splunk Customer Success.