Send alert notifications to a webhook using Splunk Observability Cloud 🔗
You can configure Splunk Observability Cloud to automatically send alert notifications to a webhook when an alert condition triggers the detector and when a clear condition clears the alert.
Note
To add a webhook as a detector alert recipient, you must have administrator access. To get this access, ask an existing administrator to add it to your user profile. See Request administrative access for more information.
If your webhook endpoint fails to respond to a detector notification, Observability Cloud retries the notification for up to 24 hours. If your endpoint still doesn’t respond, you don’t receive the notification.
To send Observability Cloud alert notifications to a webhook, complete the following configuration tasks:
Step 1: Create a webhook 🔗
Create a webhook that listens for and receives Observability Cloud alert notification requests.
Your webhook must use a secure (HTTPS) connection and must support Transport Layer Security (TLS) 1.2 or higher.
To help secure your webhook, establish a shared secret string. When you create the webhook notification integration, you enter this string in the Shared Secret field. Observability Cloud uses the string as part of a cryptographic algorithm that generates a unique message code for your notification, and then inserts the code in the X-SFX-Signature header of the outgoing webhook notification request. When your code receives the request, use the same algorithm, including the shared secret string, to generate a code. If the codes are identical, the request to your webhook is secure and valid.
To learn more about the shared secret string, the cryptographic algorithm, and the message code, see the Webhook integrations section in the Splunk Observability Cloud Developers Guide.
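For example, a Python receiver might verify the message code before trusting the payload. The following is a minimal sketch, assuming the code is a Base64-encoded HMAC digest of the raw request body computed with the shared secret and delivered in the X-SFX-Signature header; confirm the exact algorithm, encoding, and header name in the Developers Guide before relying on it.

import base64
import hashlib
import hmac

# The same string you enter in the integration's Shared Secret field (example value).
SHARED_SECRET = b"my-shared-secret"

def signature_is_valid(raw_body: bytes, received_code: str) -> bool:
    """Recompute the message code and compare it to the code from the request header.

    Assumption: the code is a Base64-encoded HMAC-SHA1 digest of the raw request
    body. Verify the algorithm and encoding in the Developers Guide.
    """
    expected = base64.b64encode(
        hmac.new(SHARED_SECRET, raw_body, hashlib.sha1).digest()
    ).decode("ascii")
    # Constant-time comparison avoids leaking timing information.
    return hmac.compare_digest(expected, received_code)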
Your webhook must return an HTTP 200 OK response code immediately after it receives the request. If Observability Cloud doesn't receive a 200 response code within a certain time frame, it retries the request.
Observability Cloud sends the webhook notification request to your webhook using these settings:
HTTP verb: POST
Media Type: Content-Type: application/json
HTTP request body: Includes the parameters described in Observability Cloud webhook request body fields.
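For illustration, the following Python sketch accepts the POST request, acknowledges it with 200 OK right away, and only then inspects the JSON payload. The port and handler behavior are arbitrary examples; in production you would terminate TLS 1.2 or higher in front of this server and verify the message code as described earlier in this step.

import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class AlertWebhookHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the JSON payload that Observability Cloud sends with Content-Type: application/json.
        length = int(self.headers.get("Content-Length", 0))
        raw_body = self.rfile.read(length)

        # Respond with 200 OK immediately so Observability Cloud doesn't retry the notification.
        self.send_response(200)
        self.send_header("Content-Length", "0")
        self.end_headers()

        # Defer real work (queueing, paging, logging) until after the response is sent.
        notification = json.loads(raw_body)
        print(notification.get("detector"), notification.get("severity"))

if __name__ == "__main__":
    # Arbitrary port for illustration only.
    HTTPServer(("", 8443), AlertWebhookHandler).serve_forever()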
In Step 2: Create a webhook integration in Observability Cloud, you need to provide the following information about your webhook:
URL: The URL for your webhook endpoint.
Shared secret: The shared secret string described earlier in this step.
Header parameters: Array of key-value pairs that you want to pass to your webhook code. Observability Cloud adds these pairs to the HTTP request headers of the notification. Use this array to pass values that aren't part of the Observability Cloud webhook request body.
Observability Cloud webhook request body fields 🔗
Observability Cloud provides the following parameters in a JSON object in its request body.
For request body examples, see Observability Cloud webhook request examples.
Field | Format | Description
---|---|---
detector | string | Name of the detector that triggered the alert
detectorUrl | string | URL of the detector, including a parameter that selects the incident that led to the notification
detectorId | string | ID of the detector
description | string | (Optional) Description of the detector
imageUrl | string | URL of the alert preview image
incidentId | string | Unique identifier for this alert notification
eventType | string | Unique identifier for the version of the detector that sent the notification
rule | string | Name of the detector rule that triggered the alert
severity | string | Severity level of the rule
runbookUrl | string | (Optional) Runbook URL you specified for the detector rule
tip | string | (Optional) Tip you specified for the detector rule
messageTitle | string | Notification title you specified for the detector rule
messageBody | string | Notification message you specified for the detector rule
detectOnCondition | string | Trigger expression for the rule, in SignalFlow format. Includes the metrics, dimensions, functions, and so forth.
detectOffCondition | string | (Optional) Clear expression for the rule, in SignalFlow format. Includes the metrics, dimensions, functions, and so forth.
status | string | Kept for backwards compatibility. Use statusExtended instead.
statusExtended | string | State of the incident, for example anomalous when an alert triggers or ok when it clears
timestamp | string | Time the event occurred, in ISO 8601 format
inputs | array | Map of the inputs involved in this rule. For more information, see inputs array.
sf_schema | integer | Schema version for this event. The value is always 2.
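As a usage sketch, a receiver might assemble a one-line summary from a few of these fields. The helper below is illustrative only; the field names come from the table above and the request examples later in this topic, and the payload is assumed to be already parsed from JSON.

def summarize_notification(payload: dict) -> str:
    """Build a one-line summary from a parsed webhook notification payload."""
    # Prefer statusExtended; fall back to the backward-compatible status field.
    state = payload.get("statusExtended") or payload.get("status", "unknown")
    return (
        f"[{payload.get('severity', 'unknown')}] "
        f"{payload.get('detector', 'unknown detector')} / {payload.get('rule', 'unknown rule')}: "
        f"{state} at {payload.get('timestamp', '')}"
    )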
inputs array 🔗
The Observability Cloud webhook request includes an inputs array. Each object in the array has the name of the program variable to which it's bound. If an input isn't bound to a program variable, its name uses the pattern _S0, _S1, and so forth.
Each input object contains the following elements:
Element | Description
---|---
key | (Optional) A map of the dimensions of the input signal. If the detector condition doesn't use dimensions, this element is empty. If the input is a static value rather than a comparison against scalar values, the element is not present.
value | Value of the input when the detector triggered or cleared the alert
fragment | (Optional) The fragment of the SignalFlow program that represents the input. This element might not be present for some detectors or for static, anonymous inputs.
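For example, a receiver could walk the inputs map to log each input's dimensions and value. This sketch is illustrative; it reads the elements as key (or dimensions, as one of the request examples below shows), value, and fragment, and it tolerates static inputs that have no dimensions.

def describe_inputs(inputs: dict) -> list[str]:
    """Return a readable line for each input in the notification's inputs map."""
    lines = []
    for name, detail in inputs.items():
        # The dimensions map appears as "key" (or "dimensions" in some examples)
        # and is absent for static inputs such as "_S2".
        dims = detail.get("key") or detail.get("dimensions") or {}
        dim_text = ",".join(f"{k}={v}" for k, v in dims.items()) or "no dimensions"
        lines.append(f"{name} ({dim_text}) -> {detail.get('value')}")
    return lines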
Observability Cloud webhook request examples 🔗
This section provides examples of the JSON request body that Observability Cloud can send to a webhook.
Note
A realm is a self-contained deployment of Splunk Observability Cloud in which your organization is hosted. Different realms have different API endpoints. For example, the endpoint for sending data in the us1 realm is https://ingest.us1.signalfx.com, while the endpoint for sending data in the eu0 realm is https://ingest.eu0.signalfx.com.
When you see a placeholder realm name in the documentation, such as <YOUR_REALM>, replace it with your actual realm name. To find your realm name, open the left navigation menu in Observability Cloud, select Settings, and select your username. The realm name appears in the Organizations section. If you don't include the realm name when specifying an endpoint, Observability Cloud defaults to the us0 realm.
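If your webhook code also calls Observability Cloud endpoints, you can derive the realm-specific URL from your realm name. The following sketch simply reflects the ingest endpoint pattern shown in the note above; the default value mirrors the us0 default realm.

def ingest_endpoint(realm: str = "us0") -> str:
    """Return the realm-specific ingest endpoint, defaulting to the us0 realm."""
    return f"https://ingest.{realm}.signalfx.com"

# For example, ingest_endpoint("eu0") returns "https://ingest.eu0.signalfx.com".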
The following JSON is the request body for a webhook alert notification from a detector that alerts when memory use reaches or exceeds 90% for 10 minutes.
{
"sf_schema": 2,
"detector": "Memory usage detector",
"detectorUrl": "https://app.<REALM>.signalfx.com/#/detector/ABCDEFGHIJK/edit",
"description": "A detector that alerts when memory usage exceeds 90% for 10 minutes",
"incidentId": "BCDEFGHIJKL",
"eventType": "exceedMemoryUse",
"rule": "Running out of memory",
"severity": "Minor",
"description": "Memory has reached 90% of maximum for 10 minutes",
"detectOnCondition": "when(A > 90, '10m')",
"detectOffCondition": "when(A < 90, '15m')",
"status": "ok",
"statusExtended": "ok",
"imageUrl": "https://org.<YOUR_REALM>.signalfx.com/#/chart/abCDefGHij",
"timestamp": "2023-02-08T19:43:30Z",
"inputs": {
"_S1": {
"dimensions": {
"host": "i-346235qa",
"plugin": "o11y-metadata"
},
"value": 96.235234634345,
"fragment": "data('memory.utilization')"
}
}
}
This is the request body for a webhook alert notification from a detector that alerts when host latency is greater than the data center latency and the data center latency is greater than 40 ms.
{
"sf_schema": 2,
"detector": "My detector",
"detectorUrl": "https://app.<REALM>.signalfx.com/#/detector/<id>/edit",
"incidentId": "<id>",
"eventType": "<event-type>",
"rule": "My detector rule",
"severity": "Critical",
"description": "Latency of host myserver is 43.4, over a datacenter-wide latency of 42.9",
"status": "anomalous",
"statusExtended": "anomalous",
"imageUrl": "https://org.<REALM>.signalfx.com/#/chart/abCDefGHij",
"timestamp": "20122-10-25T21:19:38Z",
"detectOnCondition": "when(a > b and b > 40)",
"inputs": {
"a": {
"key": {
"host": "myserver",
"dc": "us-west-1"
},
"value": 43.4,
"fragment": "data('latency').p99(by=['host', 'dc'])"
},
"b": {
"key": {
"dc": "us-west-1"
},
"value": 42.9,
"fragment": "data('latency').p99(by='dc')"
},
"_S2": {
"value": 40,
"fragment": "40"
}
}
}
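A common pattern in webhook code is to branch on statusExtended to tell newly triggered alerts apart from cleared ones. The sketch below uses the anomalous and ok states that appear in the two examples above; handle any other incident state according to your own policy.

def handle_notification(payload: dict) -> None:
    """Route a webhook notification based on the incident state."""
    state = payload.get("statusExtended")
    incident = payload.get("incidentId")
    if state == "anomalous":
        print(f"ALERT triggered for incident {incident}: {payload.get('description')}")
    elif state == "ok":
        print(f"Alert cleared for incident {incident}")
    else:
        # Other incident states can be handled here as needed.
        print(f"Incident {incident} is in state {state!r}")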
Step 2: Create a webhook integration in Observability Cloud 🔗
You must be an Observability Cloud administrator to complete this task.
Log in to Splunk Observability Cloud.
Open the Webhook guided setup. Optionally, you can navigate to the guided setup on your own:
In the navigation menu, select Data Management.
Select Add Integration.
In the integration filter menu, select All.
In the Search field, search for Webhook, and select it.
Select New Integration to display the configuration options.
Enter a name for this integration. Give your integration a unique and descriptive name. For information about the downstream use of this name, see About naming your integrations.
In the URL field, enter the webhook URL you created in Step 1: Create a webhook.
In the Shared Secret field, enter the shared secret you established in Step 1: Create a webhook.
In the Headers section, enter any header parameters required by the webhook you created in Step 1: Create a webhook.
Select Save.
If Observability Cloud validates the URL, shared secret, and headers you provided for your webhook, a Validated! success message appears. If you see an error message, make sure that the values you entered match the values you defined in Step 1: Create a webhook.
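If you prefer to script this step rather than use the guided setup, you might create the integration through the Splunk Observability Cloud Integrations API instead. The following sketch is only an assumption-based outline: it assumes the /v2/integration endpoint, the Webhook integration type, and the name, url, and sharedSecret fields, and it omits custom headers. Verify the endpoint and request schema in the API reference before using it.

import json
from urllib.request import Request, urlopen

def create_webhook_integration(realm: str, token: str) -> dict:
    """Sketch: create a webhook integration through the Integrations API.

    The endpoint and field names are assumptions; confirm them in the API reference.
    """
    body = {
        "type": "Webhook",
        "name": "My webhook integration",                   # hypothetical integration name
        "enabled": True,
        "url": "https://webhook.example.com/o11y-alerts",   # hypothetical webhook endpoint
        "sharedSecret": "my-shared-secret",                  # hypothetical shared secret
    }
    request = Request(
        f"https://api.{realm}.signalfx.com/v2/integration",
        data=json.dumps(body).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "X-SF-TOKEN": token,  # session or API access token with admin rights
        },
        method="POST",
    )
    with urlopen(request) as response:
        return json.load(response)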
Step 3: Add a webhook integration as a detector alert recipient in Observability Cloud 🔗
To add a webhook integration as a detector alert recipient in Observability Cloud:
Create or edit a detector that you want to configure to send alert notifications using your webhook integration.
For more information about working with detectors, see Create detectors to trigger alerts and Subscribe to alerts using the Detector menu.
In the Alert recipients step, select Add Recipient.
Select Webhook and then select the name of the webhook integration you want to use to send alert notifications. This is the integration name you created in Step 2: Create a webhook integration in Observability Cloud.
Activate and save the detector.
Splunk Observability Cloud sends an alert notification to the webhook when the detector triggers or clears an alert.