Jenkins
Description
The Splunk Distribution of OpenTelemetry Collector provides this integration as the jenkins
monitor by using the SignalFx Smart Agent Receiver.
Use this integration to collect metrics from Jenkins instances by querying the following endpoints:
Job metrics with the ../api/json endpoint
Codahale or Dropwizard JVM metrics with the metrics/<MetricsKey>/.. endpoint
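As a rough sketch of what the monitor queries, the two endpoint URLs can be built as follows. The host, port, and key are placeholder values taken from the sample configurations later in this page, and the trailing sub-path on the JVM metrics URL is an assumption, since the exact endpoint is elided above:

```python
# Sketch only: builds the two endpoint URLs this monitor scrapes.
# host, port, and metrics_key are placeholder values, not real credentials.
def jenkins_endpoints(host: str, port: int, metrics_key: str) -> dict:
    base = f"http://{host}:{port}"
    return {
        # Job metrics come from the Jenkins JSON API.
        "jobs": f"{base}/api/json",
        # JVM metrics come from the Jenkins Metrics plugin; the trailing
        # "/metrics" sub-path is an assumption (the text elides it as "..").
        "jvm": f"{base}/metrics/{metrics_key}/metrics",
    }

urls = jenkins_endpoints("127.0.0.1", 8080, "reallylongmetricskey")
```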
Benefits
After you configure the integration, you can access these features:
View metrics. You can create your own custom dashboards, and most monitors provide built-in dashboards as well. For information about dashboards, see View dashboards in Observability Cloud.
View a data-driven visualization of the physical servers, virtual machines, AWS instances, and other resources in your environment that are visible to Infrastructure Monitoring. For information about navigators, see Splunk Infrastructure Monitoring navigators.
Access the Metric Finder and search for metrics sent by the monitor. For information, see Use the Metric Finder.
Installation
Follow these steps to deploy this integration:
Deploy the Splunk Distribution of OpenTelemetry Collector to your host or container platform.
Configure the monitor, as described in the Configuration section.
Restart the Splunk Distribution of OpenTelemetry Collector.
Configuration
This monitor type is available in the Smart Agent Receiver, which is part of the Splunk Distribution of OpenTelemetry Collector. You can use existing Smart Agent monitors as OpenTelemetry Collector metric receivers with the Smart Agent Receiver.
This monitor type requires a properly configured environment on your system with a functional Smart Agent release bundle installed. The Collector provides this bundle in its installation paths for x86_64/amd64.
To activate this monitor type in the Collector, add the following lines to your configuration (YAML) file:
Configuration example
To activate this monitor in the Splunk Distribution of OpenTelemetry Collector, add the following to your agent configuration:
receivers:
  smartagent/jenkins:
    type: collectd/jenkins
    ... # Additional config
To complete the monitor activation, you must also include the smartagent/jenkins receiver in a metrics pipeline. To do this, add the receiver to the service/pipelines/metrics/receivers section of your configuration file. For example:
service:
  pipelines:
    metrics:
      receivers: [smartagent/jenkins]
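For context, the two fragments above combine into a complete agent configuration like the following hypothetical sketch. The signalfx exporter, token variable, and realm shown here are placeholders for whatever exporter your deployment already uses:

```yaml
receivers:
  smartagent/jenkins:
    type: collectd/jenkins
    host: 127.0.0.1
    port: 8080
    metricsKey: reallylongmetricskey   # placeholder key

exporters:
  signalfx:
    access_token: "${SFX_TOKEN}"       # placeholder token
    realm: us0                         # placeholder realm

service:
  pipelines:
    metrics:
      receivers: [smartagent/jenkins]
      exporters: [signalfx]
```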
See configuration examples for specific use cases that show how the Splunk Distribution of OpenTelemetry Collector can integrate and complement existing environments.
Configuration settings
The following table shows the configuration options for this monitor:
Option | Required | Type | Description
---|---|---|---
`pythonBinary` | no | `string` | Path to a python binary that should be used to execute the Python code. If not set, a built-in runtime is used. Can include arguments to the binary as well.
`host` | yes | `string` | Hostname or IP address of the Jenkins instance. For example, `127.0.0.1`.
`port` | yes | `integer` | Port of the Jenkins instance. For example, `8080`.
`path` | no | `string` |
`metricsKey` | yes | `string` | Key required for collecting metrics. The access key located at `Manage Jenkins > Configure System > Metrics > ADD`.
`enhancedMetrics` | no | `bool` | Whether to enable enhanced metrics (default: `false`)
`includeMetrics` | no | `list of strings` | Used to enable individual enhanced metrics when `enhancedMetrics` is `false`
`username` | no | `string` | User with security access to Jenkins
`apiToken` | no | `string` | API token of the user
`useHTTPS` | no | `bool` | Whether to enable HTTPS (default: `false`)
`sslKeyFile` | no | `string` | Path to the keyfile
`sslCertificate` | no | `string` | Path to the certificate
`sslCACerts` | no | `string` | Path to the ca file
`skipVerify` | no | `bool` | Whether to skip SSL certificate validation (default: `false`)
Sample YAML configurations
Sample basic YAML configuration:
monitors:
  - type: collectd/jenkins
    host: 127.0.0.1
    port: 8080
    metricsKey: reallylongmetricskey
Sample YAML configuration with specific enhanced metrics included:
monitors:
  - type: collectd/jenkins
    host: 127.0.0.1
    port: 8080
    metricsKey: reallylongmetricskey
    includeMetrics:
      - "vm.daemon.count"
      - "vm.terminated.count"
Sample YAML configuration with all enhanced metrics included:
monitors:
  - type: collectd/jenkins
    host: 127.0.0.1
    port: 8080
    metricsKey: reallylongmetricskey
    enhancedMetrics: true
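The interaction between enhancedMetrics and includeMetrics in the samples above can be modeled roughly as follows. This is a simplified sketch of the filtering behavior under stated assumptions, not the monitor's actual implementation:

```python
def emit_metric(name: str, is_enhanced: bool,
                enhanced_metrics: bool, include_metrics: list) -> bool:
    """Simplified model: default metrics always pass; enhanced metrics
    pass only when enhancedMetrics is true or the metric name is listed
    in includeMetrics."""
    if not is_enhanced:
        return True
    return enhanced_metrics or name in include_metrics

# With the second sample config (enhancedMetrics off, two names listed):
include = ["vm.daemon.count", "vm.terminated.count"]
```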
Metrics
These metrics are available for this integration.
Get help
If you are not able to see your data in Splunk Observability Cloud, try these tips:
Submit a case in the Splunk Support Portal
Available to Splunk Observability Cloud customers
Ask a question and get answers through community support at Splunk Answers
Available to Splunk Observability Cloud customers and free trial users
Join the Splunk #observability user group Slack channel to communicate with customers, partners, and Splunk employees worldwide
Available to Splunk Observability Cloud customers and free trial users
To learn how to join, see Get Started with Splunk Community - Chat groups
To learn about even more support options, see Splunk Customer Success.