Splunk integration for Splunk On-Call 🔗
The following guide shows how to integrate Splunk On-Call with alert actions from searches in Splunk Enterprise and Splunk Cloud Platform.
Requirements 🔗
The integration supports the following Splunk versions:
Splunk Enterprise: 9.1, 9.0, 8.2, 8.1, 8.0, 7.3, 7.2, 7.1, 7.0
Splunk Cloud Platform: 9.1, 9.0, 8.2, 8.1, 8.0, 7.3, 7.2, 7.1, 7.0
The following roles and capabilities are required:
For v.1.0.23 or newer
Setup and configuration require:
admin
victorops_admin and admin_all_objects
list_storage_passwords and admin_all_objects
Usage and testing require:
admin
victorops_admin
victorops_user
For v.1.0.18 or lower, setup and configuration require the following capabilities:
list_storage_passwords and admin_all_objects
For on-premises installation, open port 443 for outgoing communication with Splunk On-Call. The full URL uses the following format:
https://alert.victorops.com/integrations/generic/20131114/alert/<your_api_key>/<your_routing_key>
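For example, a minimal test alert posted to this endpoint might look like the following cURL command; the API key and routing key values in the URL are placeholders for your own keys, and the message values are arbitrary:
curl -X POST "https://alert.victorops.com/integrations/generic/20131114/alert/<your_api_key>/<your_routing_key>" -H "Content-Type: application/json" -d '{"message_type": "INFO", "monitoring_tool": "splunk", "state_message": "Connectivity test", "entity_display_name": "Connectivity test"}'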
Configure Splunk Enterprise 🔗
Note
When updating to a newer version of the app, run the bump command to clear client and server assets that have been cached. See Customization options for more information.
In Splunk On-Call 🔗
From Splunk On-Call, navigate to Integrations, 3rd Party Integrations, Splunk Enterprise, then select Enable Integration. Copy the API key to the clipboard to use in later steps.
In Splunk Enterprise 🔗
On Splunkbase, search for Splunk On-Call. Select Download and accept the license agreements by checking the boxes and selecting Agree to Download.
Start Splunk and open the web UI in a browser. From the top navigation bar, expand the menu and select Manage Apps. Next, select Install app from file.
Choose the Splunk On-Call for Splunk app .tgz file you downloaded earlier, check Upgrade app to ensure your application is updated to the latest version. Next, select Upload then finish the process by restarting Splunk.
Once Splunk has restarted, return to the Manage Apps page and select Launch App next to the Splunk On-Call Incident Management app. Continue the configuration in the Splunk On-Call Incident Response homepage.
Alert API key configuration 🔗
On the Alert API Key Configuration page, paste the API key copied earlier, along with any desired routing key from your Splunk On-Call organization. If the routing key is empty, alerts are routed to your default routing key. You can also access your API key by following the Splunk On-Call Splunk Integration link.
Testing configuration 🔗
After the API key is saved, you can verify the integration by selecting Test under Actions. This test alert isn't an incident in your organization and is logged as an INFO alert. To find this test alert, look in your timeline instead of the Incidents tab. Alternatively, from the Search app in Splunk, type:
| sendalert victorops param.message_type="INFO"
This sends a test alert directly to your Splunk On-Call timeline. To create an incident, change INFO to CRITICAL.
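As a minimal sketch, a critical alert that also sets an explicit routing key might look like the following; the routing key value is a placeholder, and param.state_message is an assumed parameter name not listed in this guide:
| sendalert victorops param.message_type="CRITICAL" param.routing_key="<your_routing_key>" param.state_message="Test incident from Splunk"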
Data API configuration and routing keys 🔗
For versions 1.0.21 and higher, you can add your Splunk On-Call API ID and API Key, found in Splunk On-Call under Integrations, API, to retrieve routing keys within Splunk On-Call. If you have yet to generate your API key and ID, activate and generate your organization's key and ID.
After the API Key and API ID are saved, select Retrieve Routing Keys to retrieve the most up-to-date list of your organization's routing keys.
When creating a Splunk On-Call alert action, a menu of all routing keys within your Splunk On-Call organization appears.
Configure Splunk On-Call alert actions 🔗
The following is an example of creating a new alert based on a search. From a new search, select Save As, then select Alert.
Give the alert a title, description, and permissions as well as a schedule. Under + Add Actions, select Splunk On-Call.
Select the desired message type, and use the state message field to add a brief description of what this particular alert indicates. You can overwrite the default value for entity_id if desired. If no API key or routing key is selected, alerts use the default values for these fields. Additionally, you can dynamically reference Splunk fields within these assignments using tokens.
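For instance, a sketch of token usage in these fields might look like the following, where host and cpu_percent are hypothetical field names returned by your search:
entity_id: $result.host$-cpu-alert
State message: CPU usage is $result.cpu_percent$% on $result.host$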
Once the specified conditions are met, an alert appears in your Splunk On-Call timeline.
Alert annotations 🔗
In Splunk On-Call, under the Annotations tab in the incident, all Splunk alerts include an alert link that directs you back to the Splunk alert.
To add other incident annotations see Splunk On-Call Alert Rules Engine.
Splunk and Splunk On-Call mapped fields 🔗
The following table shows mapped Splunk and Splunk On-Call fields:
Configure Splunk Cloud Platform 🔗
In Splunk On-Call 🔗
From the Splunk On-Call web portal, navigate to Integrations, 3rd Party Integrations, Splunk Enterprise, then select Enable Integration. Copy the API key to the clipboard to use in later steps.
In Splunk Cloud Platform 🔗
Under Apps, select Find More Apps, then in the search bar type Splunk On-Call. Select Install. Once the app is installed, it appears under Apps.
Open the app to go to the Splunk On-Call Incident Response Home page, which guides you through setting up the account, configuring API keys, and testing alerts. Once your configuration is complete, a check mark appears next to each configuration step.
Alert API key configuration 🔗
On the Alert API Key Configuration page, paste the API key copied earlier, along with any desired routing key from your Splunk On-Call organization. If the routing key is blank, alerts are routed to your default routing key. You can also access your API key by selecting Splunk On-Call Splunk Integration.
Data API configuration and routing keys 🔗
For versions 1.0.21 and higher, you can add your Splunk On-Call API ID and API Key, found in Splunk On-Call under Integrations, API, to retrieve routing keys within Splunk On-Call. If you have yet to generate your API key and ID, activate and generate your organization's key and ID.
Once the API Key and API ID are saved, select Retrieve Routing Keys to retrieve the most up-to-date list of your organization's routing keys.
When creating a Splunk On-Call alert action, a menu with all routing keys within your Splunk On-Call organization appears.
Test the configuration 🔗
After the API key is saved, you can verify the integration by selecting Test under Actions. This test alert isn't an incident in your organization because it's logged as an INFO alert. To find the test alert, look in your timeline instead of the Incidents tab. Alternatively, type the following from the Search app in Splunk:
| sendalert victorops param.message_type="INFO"
To create an incident, change INFO to CRITICAL.
Configure Splunk On-Call alert actions 🔗
The following is an example of setting up a new alert based on a search. From a new search, select Save As, then select Alert.
Give the alert a title, description, and permissions, and configure the schedule. Under + Add Actions, select Splunk On-Call.
Select the desired message type, and use the state message field to add a brief description of what this particular alert indicates. You can overwrite the default value for entity_id if desired. If no API key or routing key is selected, alerts use the default values for these fields. Additionally, you can reference Splunk fields within these assignments using tokens.
Once the specified conditions are met, an alert appears in your Splunk On-Call timeline.
Alert annotations 🔗
In Splunk On-Call, under the Annotations tab in the incident, all Splunk alerts include an alert link that directs you back to the Splunk alert.
To add other incident annotations, see Splunk On-Call Alert Rules Engine.
Advanced configuration 🔗
Proxy settings 🔗
You can activate a proxy configuration for the integration by navigating to Configuration, then Proxy Configuration.
Alert recovery configuration 🔗
Once the Splunk for Splunk On-Call app is enabled (version 1.0.18 and higher), the Alert Recovery checkbox is globally set to ON by default. The Alert Recovery checkbox can also be configured at the individual alert level for a more granular setting.
In the global recovery configuration, you can configure the polling interval (in seconds) as well as the number of inactive polls before sending a recovery. The following are the global default settings for Alert Recoveries:
At the individual alert level, under the Splunk On-Call Trigger Actions, you can find the Enable Recovery checkbox for the more granular setting. For versions 1.0.25 and higher, you can set the Polling Interval as well as the Inactive Poll count for each individual alert.
Note
Alert-specific recovery settings must be greater than the global recovery settings.
Dynamically setting the API Key and Routing Key using Search 🔗
For versions 1.0.25 and higher, you can set the API key as well as the routing key in the search.
The following is an example of the format needed for the dynamic values:
<alert search> | eval 'param.api_key'="xxxxxxxxxx" | eval 'param.routing_key'="xxx"
When creating the Splunk On-Call trigger action with dynamically pulled values from your search, select param.api_key as the API Key for the alert and param.routing_key as the Routing Key for the alert.
Any dynamic keys used in a search are added as keys in your Alert API Key Configuration.
Search Head cluster setup 🔗
Before running Splunk for Splunk On-Call with search head clusters, make sure that there is a deployer and at least three search heads.
The following are the steps to take when using the Splunk for Splunk On-Call app with search head clusters.
Install the latest version of the Splunk for Splunk On-Call app on the deployer using the UI.
Push the app out to the search heads by running the following command:
./bin/splunk apply shcluster-bundle -target https://sh1:8089 -auth username:password
Configure the Integration API key on one search head.
The Integration API key automatically gets replicated to the other search head nodes.
Test each search head to verify.
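For example, you can run the same test command used earlier in this guide from the Search app on each search head:
| sendalert victorops param.message_type="INFO"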
Splunk ITSI 🔗
With the Splunk On-Call and Splunk ITSI integration, you can leverage Splunk’s data and log analysis capabilities to correlate multiple incidents into single event groups and easily send alerts into Splunk On-Call. Then, teammates can collaborate in-line with monitoring data inside the Splunk On-Call timeline to speed up incident response and remediation.
To follow this integration guide you need Splunk ITSI 4.0 or higher.
In Splunk On-Call (ITSI) 🔗
From the Splunk On-Call web portal, navigate to Integrations, 3rd Party Integrations, Splunk ITSI, then select Enable Integration. Copy the API key to the clipboard to use in later steps.
In Splunk ITSI Notable Event Aggregation Policies 🔗
Navigate to Configure, Notable Event Aggregation Policies, and select the name of the aggregation policy that you want to send alerts to Splunk On-Call.
In the Action Rules tab, set your trigger conditions then select Splunk On-Call and configure your alert accordingly.
Keep the Alert Entity ID consistent for all Message Types (leave blank for default) across related actions. Splunk On-Call uses this field to identify incidents and correlate subsequent alerts with the original incident. Once configured correctly, ITSI automatically creates a Splunk On-Call incident.
Create a Splunk On-Call Incident 🔗
Navigate to the Action Rules tab for the desired Aggregation Policy. For an action to create an incident in Splunk On-Call, set the conditions to if the following event occurs: severity greater than Normal. Then select Splunk On-Call and Configure.
The monitoring tool field and message type are the only fields that need to be set. The rest of the fields use default values. The default values are:
Message Type: CRITICAL (set this value)
Monitoring Tool: splunk-itsi (set this value)
Alert Entity ID: $result.itsi_group_id
Alert Entity Display Name: $result.itsi_group_title
State Message: $result.itsi_group_title
Routing Key: Default routing key (unless specified otherwise)
This functionality requires the “Data API Keys” and organization name to be set up in the Splunk On-Call for Splunk app.
From ITSI, you can see whether there is an incident associated with the ticket.
From Splunk On-Call, annotations give you easy access back to the ITSI Filtered Episode Review or Overall Episode Review.
To Resolve a Splunk On-Call Incident 🔗
Within the same Aggregation Policy, navigate to the Action Rules tab. To resolve the episode in ITSI, select Change status to Resolved. To resolve the corresponding incident in Splunk On-Call, set the conditions to if the episode is broken, then select Splunk On-Call and select Configure.
Configure the action, making sure to select RECOVERY as the message type and ITSI as the monitoring tool; other values are default. The Alert Entity ID is the same as the initial alert, so Splunk On-Call resolves the corresponding incident when default values are used.
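For reference, a minimal sketch of the same recovery behavior through the generic alert endpoint might look like the following cURL command; the API key, routing key, and entity ID values are placeholders, and the entity_id field is assumed to carry the Alert Entity ID described above:
curl -X POST "https://alert.victorops.com/integrations/generic/20131114/alert/<your_api_key>/<your_routing_key>" -H "Content-Type: application/json" -d '{"message_type": "RECOVERY", "monitoring_tool": "splunk-itsi", "entity_id": "<itsi_group_id>", "state_message": "Episode resolved in ITSI"}'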
To acknowledge a Splunk On-Call incident manually 🔗
Navigate to Episode Review, select the desired episode, select Actions, and then select Splunk On-Call.
Configure the action, making sure to select ACKNOWLEDGEMENT as the message type and ITSI as the monitoring tool. Other values are default. The Alert Entity ID is the same as the initial alert, so Splunk On-Call acknowledges the corresponding incident when default values are used.
Splunk SAI 🔗
Splunk SAI allows you to search through depths of log data and monitor the health of your infrastructure and applications. The Splunk On-Call and Splunk Insights for Infrastructure integration allows you to set alerting thresholds on key monitoring metrics and get alerts to the right person at the right time. Through a simple dropdown in the Splunk SAI platform, choose to send alerts directly into Splunk On-Call where your team can collaborate and resolve incidents faster.
In Splunk On-Call (SAI) 🔗
From the Splunk On-Call web portal, navigate to Integrations, 3rd Party Integrations, Splunk Enterprise, then select Enable Integration. Copy the API key to the clipboard to use in later steps.
In Splunk SAI, navigate to Settings, Notifications, and paste your API key and a routing key from your Splunk On-Call account into the respective fields. Select Save Credentials.
Under the Investigate page, select an entity.
Navigate to the Analysis tab, select an alert graph, select the three dots, and then select Create Alert.
In the alert creation dialog, scroll to the bottom and select the conditions under which the alert fires. For the notification method, select Splunk On-Call. Select Submit.
Splunk Enterprise Security 🔗
Splunk Enterprise Security (ES) enables security teams to use all data to gain organization-wide visibility and security intelligence. Regardless of deployment model—on-premises, in a public or private cloud, SaaS, or any combination of these—Splunk ES can be used for continuous monitoring, incident response, running a security operations center or for providing executives a window into business risk.
In Splunk Enterprise Security App 🔗
In the Splunk Enterprise Security App navigate to the Incident Review. Once in Incident Review, select an incident you want to send to Splunk On-Call and select the menu under Actions. Next, select Run Adaptive Response Action.
A dialog appears allowing you to add Splunk On-Call as a response action.
Once the response action has been dispatched you receive a confirmation.
Troubleshooting 🔗
See the following troubleshooting steps for help. If your problem persists, send Splunk support a detailed summary of your issue, when it first occurred, and which versions of Splunk and the Splunk On-Call app you are currently running.
Splunk On-Call app was installed but I am not able to configure the app 🔗
Make sure you have the necessary permissions to configure and set up alerts for Splunk On-Call.
There is no option to customize the Alert Actions 🔗
This is because the Splunk On-Call App Alert Action permissions are not set to global. Go to Settings, Alert Actions and make sure Splunk On-Call App is set to Global sharing.
Splunk On-Call Alert Action is not visible 🔗
Sometimes resetting the Alert Action permissions can fix this issue. Go to Settings, Alert Actions, Splunk On-Call (Permissions). Next to Display For, select app, save, then reopen the permissions and select All apps. Check the trigger actions on an alert to see if the Splunk On-Call Alert Action is now visible.
Routing key retrieval is failing 🔗
This can sometimes be caused by one or more firewalls. To check whether it is an internal network issue, run the following cURL command:
curl -X POST "https://alert.victorops.com/integrations/generic/20131114/alert/SPLUNK_API_KEY" --insecure -H "accept: application/json" -H "Content-Type: application/json" -d '{"message_type": "INFO", "monitoring_tool": "splunk", "state_message": "Test Alert", "entity_display_name": "Test Alert"}'
If the command does not make it to Splunk On-Call, grep for sendalert in $SPLUNK_HOME/var/log/splunk/victorops_modalert.log and send the output to Splunk support alongside a detailed summary of the issue you are facing.
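For example, the following command searches the modular alert log for sendalert entries:
grep sendalert $SPLUNK_HOME/var/log/splunk/victorops_modalert.log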
Splunk On-Call app is not visible as an alert action for an alert 🔗
Run ./splunk btool check --debug and send the log and a detailed summary of the issue you are facing to Splunk support.
Splunk Enterprise alerts stopped alerting in Splunk On-Call 🔗
Run the following command to check for any internal network issues. If the post makes it to Splunk On-Call, check your firewalls.
curl -X POST "https://alert.victorops.com/integrations/generic/20131114/alert/SPLUNK_API_KEY" --insecure -H "accept: application/json" -H "Content-Type: application/json" -d '{"message_type": "INFO", "monitoring_tool": "splunk", "state_message": "Test Alert from localhost", "entity_display_name": "Test Alert"}'
If the post does not make it to Splunk On-Call, grep for sendalert in $SPLUNK_HOME/var/log/splunk/victorops_modalert.log and send the output and a detailed summary of the issue you are facing to Splunk support.
Integrating with ITSI Version 4.0 or lower 🔗
Part of the integration relies on system macros not included with older versions of ITSI. To alleviate the issue, you can create the macros by navigating to Settings, Advanced Settings, Search Macros within Splunk. Make sure the following macros exist:
(Table of macros and their definitions)