Format and send events to a DSP data pipeline using the Ingest REST API
To format and send events using the Splunk Ingest REST API, begin by testing whether you can send an event to an ingest endpoint:
- Open a command prompt window or terminal.
- Type the following SCloud command to test if you can send an event to an ingest endpoint.
./scloud ingest post-events -format raw <<< 'This is a test event.'
Format event data to send to the /events endpoint
Use the Ingest REST API /events endpoint to send event data to your data pipeline over HTTPS in JSON format. The /events endpoint accepts an array of JSON objects. Each JSON object represents a single DSP event.
[
  {
    "body": "Hello, World!",
    "attributes": { "message": "Something happened" },
    "host": "dataserver.example.com",
    "source": "testapplication",
    "sourcetype": "txt",
    "timestamp": 1533671808138,
    "nanos": 0,
    "id": "2823738566644596",
    "kind": "event"
  },
  {
    "body": "Hello, World2!",
    "attributes": { "message": "Something happened" },
    "host": "dataserver.example.com",
    "source": "testapplication",
    "sourcetype": "txt",
    "timestamp": 1533671808138,
    "nanos": 0,
    "id": "2518594268716256",
    "kind": "event"
  },
  ...
]
Event schema
There are nine keys that can be included in the event schema. Including an event with a field not defined in this schema results in an "INVALID_DATA" error.
Key | Required? | Description |
---|---|---|
body | Required | The event's payload. It can be one of the basic types: a string, bytes, or a number (int32, int64, float, or double). The body can also be a list or a map of basic types. The default type of body is a union of all possible types. To use downstream functions that have more specific type signatures, first cast body to the appropriate more specific type. See cast or ucast for an example. |
attributes | Optional | A JSON object that contains explicit custom fields. |
host | Optional | The host value to assign to the event data. This is typically the hostname of the client from which you are sending data. |
source | Optional | The source value to assign to the event data. For example, if you are sending data from an app you are developing, you can set this key to the name of the app. |
sourcetype | Optional | The sourcetype value to assign to the event data. |
timestamp | Optional | The event time in epoch time format, in milliseconds. If this key is missing, a timestamp with the current time is assigned when the event reaches the Ingest REST API service. |
nanos | Optional | The nanoseconds part of the timestamp. |
id | Optional | A unique ID for the event. If not specified, the system generates an ID. |
kind | Optional | The value "event", to indicate that the record is an event. |
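The event schema can also be exercised programmatically. The following is a minimal Python sketch, using only the standard library, that builds a schema-conformant event and posts an array of events to the /events endpoint. The `make_event` and `send_events` helper names are illustrative, not part of any Splunk SDK, and the host and token values are placeholders.

```python
import json
import time
import urllib.request

# Allowed top-level keys per the event schema; any other key
# results in an "INVALID_DATA" error from the Ingest REST API.
ALLOWED_KEYS = {"body", "attributes", "host", "source",
                "sourcetype", "timestamp", "nanos", "id", "kind"}

def make_event(body, **fields):
    """Build a single DSP event dict; "body" is the only required key."""
    event = {"body": body, "kind": "event", **fields}
    unknown = set(event) - ALLOWED_KEYS
    if unknown:
        raise ValueError(f"keys not in the event schema: {unknown}")
    # If omitted, the service assigns the current time; set it
    # client-side here in epoch milliseconds for determinism.
    event.setdefault("timestamp", int(time.time() * 1000))
    return event

def send_events(events, host="<DSP_HOST>", token="<token>"):
    """POST a JSON array of events to the /events ingest endpoint."""
    req = urllib.request.Request(
        f"https://{host}:31000/default/ingest/v1beta2/events",
        data=json.dumps(events).encode("utf-8"),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    return urllib.request.urlopen(req)  # raises on HTTP errors

# Build (but do not send) a test payload:
payload = [make_event("Hello, World!",
                      source="testapplication",
                      sourcetype="txt",
                      attributes={"message": "Something happened"})]
```

Because the /events endpoint rejects unknown fields, validating keys client-side, as `make_event` does, surfaces mistakes before the request is sent.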
Format metrics data to send to the /metrics endpoint
Use the Ingest REST API /metrics endpoint to send metric event data to your data pipeline over HTTPS in JSON format. The /metrics endpoint accepts a list of JSON metric objects. Each JSON object represents a single DSP metric event.
Payload schema:

[<JsonMetricObject>, <JsonMetricObject>, ...]   # a list of JsonMetricObject

JsonMetricObject = {
    "body": [<Metric>, <Metric>, ...],
    "timestamp": int64,
    "nanos": int32,
    "source": string,
    "sourcetype": string,
    "host": string,
    "id": string,
    "kind": string,
    "attributes": {
        "defaultDimensions": map[string]string,
        "defaultType": string,
        "defaultUnit": string
    }
}
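As a sketch of how this payload can be assembled, the following Python builds a minimal JsonMetricObject and serializes it. The `make_metric_object` helper name is illustrative, not part of any Splunk SDK; the field names come from the schema, while the measurement values are made up.

```python
import json
import time

def make_metric_object(metrics, **fields):
    """Build a minimal JsonMetricObject; "body" holds the Metric list."""
    return {
        "body": metrics,                       # [<Metric>, <Metric>, ...]
        "timestamp": int(time.time() * 1000),  # epoch milliseconds
        "kind": "metric",
        **fields,
    }

payload = [make_metric_object(
    # "name" and "value" are the required Metric fields.
    [{"name": "cpu.util", "value": 45.0}],
    sourcetype="aws:cloudwatch",
    host="dataserver.example.com",
    attributes={"defaultType": "g", "defaultUnit": "percent"},
)]

body = json.dumps(payload)  # the JSON array sent to the /metrics endpoint
```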
Metrics schema
There are nine keys that can be included in the metrics schema. Including a metric with a field not defined in this schema results in an "INVALID_DATA" error.
Key | Required? | Description |
---|---|---|
body | Required | An array of one or more Metric objects following the schema shown after this table. Each object represents a measurement of a given metric at the time denoted by the parent object's timestamp. |
attributes | Optional | A JSON object following the schema shown after this table. If set, individual metrics inherit these default dimensions, type, and unit. If a dimension is also given in the body field of an individual metric, the body field dimension takes precedence. |
host | Optional | The host value to assign to the event data. This is typically the hostname of the client from which you are sending data. |
source | Optional | The source value to assign to the event data. For example, if you are sending data from an app you are developing, you can set this key to the name of the app. |
sourcetype | Optional | The sourcetype value to assign to the event data. |
timestamp | Optional | The event time in epoch time format, in milliseconds. If this key is missing, a timestamp with the current time is assigned when the event reaches the Ingest REST API service. |
nanos | Optional | The nanoseconds part of the timestamp. |
id | Optional | A unique ID for the event. If not specified, the system generates an ID. |
kind | Optional | The value "metric", to indicate that the record is a metric event. |

The Metric schema:

// Example:
// Metric = {
//     "name": "cpu.util",
//     "value": 45.0,
//     "dimensions": {"Server": "nginx", "Region": "us-west-1"},
//     "type": "g",
//     "unit": "percent"
// }
Metric = map[string] object {
    "name": string,     // required. Metric name.
    "value": numeric    // required. double | float | int | long
}

The attributes schema:

"attributes": {
    "defaultDimensions": map[string]string,  // optional. String map, for example: {"Server": "nginx", "Region": "us-west-1", ...}
    "defaultType": string,                   // optional. Metric type. By default, "g" for "gauge".
    "defaultUnit": string                    // optional. Metric unit. By default, "none".
}
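The precedence rule for dimensions can be made concrete with a short sketch. The `resolve_dimensions` helper is a hypothetical name for illustration only; DSP performs this merge on the server side.

```python
def resolve_dimensions(metric, attributes):
    """Merge defaultDimensions from attributes with a metric's own
    dimensions; the metric's body-level dimensions take precedence."""
    merged = dict(attributes.get("defaultDimensions", {}))
    merged.update(metric.get("dimensions", {}))
    return merged

attributes = {"defaultDimensions": {"Server": "nginx", "Region": "us-west-1"},
              "defaultType": "g"}
metric = {"name": "cpu.util", "value": 45.0,
          "dimensions": {"Region": "us-east-2"}}

# The metric's own "Region" overrides the inherited default:
resolve_dimensions(metric, attributes)
# → {'Server': 'nginx', 'Region': 'us-east-2'}
```

A metric with no dimensions of its own simply inherits all of the defaultDimensions unchanged.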
Metrics example
The following example shows a command that sends CPU and memory utilization metrics to the Ingest REST API.
curl https://<DSP_HOST>:31000/default/ingest/v1beta2/metrics \
    -H "Authorization: Bearer <token>" \
    -H "Content-Type: application/json" \
    -X POST \
    -d '[
    {
        "body": [
            { "name": "cpu.util", "value": 45.0, "unit": "percent" },
            { "name": "mem.util", "value": 20, "unit": "gigabytes" },
            { "name": "net.in", "value": 3000, "unit": "bytes/second" }
        ],
        "sourcetype": "aws:cloudwatch",
        "timestamp": 1526627123013,
        "attributes": {
            "defaultDimensions": {
                "InstanceId": "i-065d598370ac25b90",
                "Region": "us-west-1"
            },
            "defaultType": "g"
        }
    },
    {
        "body": [
            { "name": "cpu.util", "value": 49.0, "dimensions": { "InstanceId": "i-428f599604ba25f91" } },
            { "name": "mem.util", "value": 22 },
            { "name": "net.in", "value": 4000 }
        ],
        "sourcetype": "aws:cloudwatch",
        "timestamp": 1526627123013,
        "attributes": {
            "defaultDimensions": {
                "InstanceId": "i-065d598370ac25b91",
                "Region": "us-west-1"
            },
            "defaultType": "g"
        }
    }
]'
This documentation applies to the following versions of Splunk® Data Stream Processor: 1.0.0