
Install the Technical Add-on for the Splunk OpenTelemetry Collector

You can install the Splunk Add-on for the OpenTelemetry Collector on a single universal forwarder instance or on multiple instances.

The following applies:

  • The Splunk Add-on for the OpenTelemetry Collector installer comes packaged with a number of agent bundle libraries. Do not delete these files, even if you do not plan to use an agent bundle.

  • To save space, you can delete the folder for the binaries you are not using. For example, if you are installing on Linux, you can delete the Windows folder.

  • If your configuration uses more than one collector, see Manage multiple Collectors.

Install the Splunk Add-on for the OpenTelemetry Collector to a universal forwarder instance

Follow these steps to install the Splunk Add-on for the OpenTelemetry Collector to a universal forwarder instance:

  1. Download and unzip the installation file to the machine running the universal forwarder. In your unzipped folder, locate and unzip the .tar file to SPLUNK_HOME > etc > deployment apps.

  2. Create a new "local" folder in Splunk_TA_otel/. Open the configuration folder and copy the access_token file into your new local folder.

  3. In the default folder, find the inputs.conf file. You can inspect the defaults for the settings and update the values if necessary. Note that the values in inputs.conf must match those in Splunk Web.

  • splunk_config. $SPLUNK_OTEL_TA_HOME/configs/ta-agent-config.yaml by default.

  • disabled. false by default.

  • start_by_shell. false by default.

  • splunk_access_token_file. access_token by default.

  • splunk_realm. us0 by default. A realm is a self-contained deployment that hosts organizations. You can find your realm name on your profile page in the user interface.

  • splunk_trace_ingest_url. The default value is https://ingest.us0.signalfx.com/v2/trace.
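For reference, the settings above map onto a local/inputs.conf override such as the following sketch. The stanza name is an assumption; check default/inputs.conf on your host for the exact stanza. The values shown are the documented defaults, so change the realm and ingest URL to match your organization.

```ini
# Splunk_TA_otel/local/inputs.conf — hedged sketch, not a verbatim file.
# Stanza name is assumed; confirm it against default/inputs.conf.
[Splunk_TA_otel://Splunk_TA_otel]
splunk_config = $SPLUNK_OTEL_TA_HOME/configs/ta-agent-config.yaml
disabled = false
start_by_shell = false
splunk_access_token_file = access_token
splunk_realm = us0
splunk_trace_ingest_url = https://ingest.us0.signalfx.com/v2/trace
```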

  4. In Splunk Observability Cloud, retrieve your access token value. If you do not have a token, contact your Splunk Observability Cloud administrator to create a token. See Create and manage authentication tokens using Splunk Observability Cloud to learn more about tokens.

  5. In Splunk_TA_otel/local, create or open the access_token file, and replace the existing contents with the token value you copied from Splunk Observability Cloud. Save the updated file.

  6. In Splunk Observability Cloud, select your name, then select the Organization tab to verify that the realm value in the realm and sapm-endpoint files in your local folder matches the value shown in Splunk Observability Cloud. Save any changes you make in the local files.

  7. Restart Splunkd. Your Add-on solution is now deployed.

  8. In Splunk Infrastructure Monitoring, navigate to the host where you deployed the Splunk Add-on for the OpenTelemetry Collector and select it to explore its metrics and status. For more information, see Use navigators in Splunk Infrastructure Monitoring.

Install the Splunk Add-on for the OpenTelemetry Collector to multiple universal forwarder instances using the deployment server

Follow these steps to install the Splunk Add-on for the OpenTelemetry Collector to multiple universal forwarder instances using the deployment server:

  1. Download and unzip the installation file to the machine running your deployment server. In your unzipped folder, locate and unzip the .tar file to SPLUNK_HOME > etc > deployment apps.

  2. Create a new "local" folder in Splunk_TA_otel/. Open the configuration folder and copy the access_token file into your new local folder.

  3. In the default folder, find the inputs.conf file. You can inspect the defaults for the settings and update the values if necessary. Note that the values in inputs.conf must match those in Splunk Web.

  • splunk_config. $SPLUNK_OTEL_TA_HOME/configs/ta-agent-config.yaml by default.

  • disabled. false by default.

  • start_by_shell. false by default.

  • splunk_access_token_file. access_token by default.

  • splunk_realm. us0 by default. A realm is a self-contained deployment that hosts organizations. You can find your realm name on your profile page in the user interface.

  • splunk_trace_ingest_url. The default value is https://ingest.us0.signalfx.com/v2/trace.

  4. In Splunk Observability Cloud, retrieve your access token value. If you do not have a token, contact your Splunk Observability Cloud administrator to create a token. See Create and manage authentication tokens using Splunk Observability Cloud to learn more about tokens.

  5. In Splunk_TA_otel/local, create or open the access_token file, and replace the existing contents with the token value you copied from Splunk Observability Cloud. Save the updated file.

  6. In Splunk Observability Cloud, select your name, then select the Organization tab to verify that the realm value in the realm and sapm-endpoint files in your local folder matches the value shown in Splunk Observability Cloud. Save any changes you make in the local files.

  7. In Splunk Web, select Settings > Forwarder Management to access your deployment server.

  8. Create a server class:

    1. For "Edit clients", update Include to add your universal forwarder instance and save.

    2. Go to Add apps and select your new Splunk Add-on for the OpenTelemetry Collector server class.

    3. Select Edit on your newly created server class and make sure the following are checked:

      • Enable App

      • Restart Splunkd

  9. Save. Your Add-on solution is now deployed.

  10. In Splunk Infrastructure Monitoring, navigate to the host where you deployed the Splunk Add-on for the OpenTelemetry Collector and select it to explore its metrics and status. For more information, see Use navigators in Splunk Infrastructure Monitoring.
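If you prefer to manage the server class in configuration files rather than in Splunk Web, the Forwarder Management steps above correspond roughly to a serverclass.conf entry like the following sketch. The class name and client whitelist pattern are placeholders; stateOnClient and restartSplunkd mirror the Enable App and Restart Splunkd checkboxes.

```ini
# $SPLUNK_HOME/etc/system/local/serverclass.conf on the deployment server.
# "otel_forwarders" and the whitelist pattern are example values only.
[serverClass:otel_forwarders]
whitelist.0 = uf-*.example.com

[serverClass:otel_forwarders:app:Splunk_TA_otel]
stateOnClient = enabled
restartSplunkd = true
```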

Configure the deployment mode of your Splunk Add-on Collector instance

The OpenTelemetry Collector has different deployment modes:

  • Host monitoring (agent): This is the default value and the simplest configuration. The Splunk Add-on for the OpenTelemetry Collector, when configured as an agent, sends data to Splunk Observability Cloud.

  • Data forwarding (gateway): When configured as a gateway, your Splunk Add-on for the OpenTelemetry Collector collects data from one or more agents before forwarding it to Splunk Observability Cloud.

  • As an agent that sends data to a gateway: To use a gateway instance, you create one or more instances of the Splunk Add-on for the OpenTelemetry Collector as agents that send data to that gateway instance.

Deploy the Splunk Add-on for the OpenTelemetry Collector as an agent

As an agent, the OpenTelemetry Collector sends data directly to Splunk Observability Cloud. This is the default configuration. Learn more at Host monitoring (agent) mode.

If your instance is not currently configured as an agent, edit your inputs.conf file and update the splunk_config setting to reflect your agent configuration file name. You can find this file in your directory at /otelcol/config/. The default file name is ta-agent-config.yaml. If you are using a custom configuration file, provide that file name instead.
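For example, assuming the default agent configuration file name and path, the relevant override in local/inputs.conf would look like the following sketch (the stanza name is an assumption; check default/inputs.conf on your host):

```ini
# local/inputs.conf — point the add-on at the agent configuration file.
[Splunk_TA_otel://Splunk_TA_otel]
splunk_config = $SPLUNK_OTEL_TA_HOME/configs/ta-agent-config.yaml
```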

Deploy the Splunk Add-on for the OpenTelemetry Collector as a gateway

If deployed as a gateway, the Collector instance can collect data from one or more Collector instances deployed as agents. The gateway instance then sends that data to Splunk Observability Cloud. Learn more at Data forwarding (gateway) mode.

To configure your Splunk Add-on for the OpenTelemetry Collector as a gateway:

  1. Edit your inputs.conf file to update the splunk_config setting with your gateway configuration file name. You can find this file in your directory at /otelcol/config/. The default file name for the gateway file is ta-gateway-config.yaml. If you are using a custom configuration file, provide that file name.

  2. In local/inputs.conf, set the splunk_listen_interface value to 0.0.0.0 or to the specific IP address that sends data to this gateway.

Caution

You must also configure one or more Collector instances as agents that send data to your new gateway.
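Assuming the default gateway file name and the stanza name from default/inputs.conf (both assumptions), the two gateway settings combine in local/inputs.conf as in this sketch:

```ini
# local/inputs.conf — hedged gateway-mode sketch.
[Splunk_TA_otel://Splunk_TA_otel]
splunk_config = $SPLUNK_OTEL_TA_HOME/configs/ta-gateway-config.yaml
# Use 0.0.0.0 to listen on all interfaces, or a specific agent-facing IP.
splunk_listen_interface = 0.0.0.0
```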

Configure the Splunk Add-on for the OpenTelemetry Collector as an agent that sends data to a gateway

You can set up one or more Collector instances as agents that send data to another instance that is set up as a gateway. See more at Send data from an agent Collector to a gateway Collector.

To do this, configure an instance that works as a gateway, and then configure one or more instances that operate as agents:

  1. Create your gateway, if you have not already done so. See Deploy the Splunk Add-on for the OpenTelemetry Collector as a gateway for more information.

  2. Edit your inputs.conf file to update the splunk_config setting to reflect your gateway configuration file name. You can find the default configuration file in your directory at /otelcol/config/. The default file name for this configuration file is ta-agent-to-gateway-config.yaml. If you are using a custom configuration file, provide that file name.

  3. In the README directory, open inputs.conf.spec and copy the splunk_gateway_url attribute.

  4. Paste this value into ta-agent-to-gateway-config.yaml and then update the value for this setting with the gateway IP address.
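The attribute copied in step 3 is a simple key-value setting; a hedged sketch of how it might look once the gateway address is filled in (the address below is a placeholder for your gateway host, and the exact placement follows step 4):

```ini
# Copied from README/inputs.conf.spec and updated with the gateway address.
# 192.0.2.10 is a documentation placeholder; use your gateway's IP address.
splunk_gateway_url = http://192.0.2.10
```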

This page was last updated on Oct 02, 2024.