Splunk Cloud Platform

Use Ingest Processors


Ingest Processor is currently released as a preview only and is not officially supported. See Splunk General Terms for more information. For any questions on this preview, please reach out to ingestprocessor@splunk.com.

Generate logs into metrics using Ingest Processor

You can create a pipeline that generates metrics from your log data. Converting logs to metrics lets you surface information from your data in a more visible form and configure further data processing based on those metrics. You can then send the converted subset of your data to supported destinations, including Splunk Cloud indexers and Splunk Observability Cloud.

Configuring a pipeline to generate metrics from logs involves doing the following:

  • Specifying the partition of the incoming data that the pipeline receives and a destination that the pipeline sends data to. See Partitions for more information.
  • Defining the metrics generation fields by including a thru command in the SPL2 statement of the pipeline. See thru command overview in the SPL2 Search Reference for more information.
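At a high level, the thru command branches a copy of the pipeline data through a logs_to_metrics conversion and routes the result to a separate metrics destination, while the original events continue to the main destination. The following is a minimal sketch of that shape, using placeholder names such as metric_value and the "host" dimension; the full multi-metric version appears in the steps below:

    $pipeline = | from $source
        | thru [
            | logs_to_metrics name="mymetric" metrictype="gauge" value=metric_value time=_time dimensions={"host": host}
            | into $metrics_destination
        ]
        | into $destination;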

Prerequisites

Before creating a pipeline, confirm the following:

Steps

Perform the following steps to create a pipeline that converts logs to metrics:

  1. From the home page on Splunk Cloud Platform, navigate to the Pipelines page and select New pipeline, then Ingest Processor pipeline.
  2. On the Get started page, select Blank pipeline, then Next.
  3. On the Define your pipeline's partition page, define the subset of incoming data that you want this pipeline to process.
    1. Select how you want to partition the incoming data that is sent to your pipeline. You can partition by source type, source, or host.
    2. Enter the conditions for your partition, including the operator and the value. Your pipeline will receive and process the incoming data that meets these conditions.
    3. Select Next to confirm the pipeline partition.
  4. On the Add sample data page, do the following:
    1. Enter or upload sample data to generate a preview of how your pipeline processes data. The sample data must contain accurate examples of the values that you want to generate into metrics.
    2. Select Next to confirm the sample data.
  5. On the Select a metrics destination page, select the name of the destination that you want to send your metrics to.
  6. On the Select destination dataset page, select the name of the destination that you want to send logs to, then do the following:
    1. If you selected a Splunk platform S2S or Splunk platform HEC destination, select Next.
    2. If you selected another type of destination, select Done and skip the next step.
  7. (Optional) If you're sending data to a Splunk platform deployment, you can specify a target index:
    1. In the Index name field, select the name of the index that you want to send your data to.
    2. (Optional) In some cases, incoming data already specifies a target index. If you want your Index name selection to override previous target index settings, then select the Overwrite previously specified target index check box.
    3. Select Done.
    If you're sending data to a Splunk platform deployment, be aware that the destination index is determined by a precedence order of configurations.

  8. (Optional) To generate a preview of how your pipeline processes data based on the sample data that you provided, select the Preview Pipeline icon. Use the preview results to validate your pipeline configuration.
  9. On the SPL2 editor page, add processing commands to your SPL2 statement as needed. You can do this by selecting the plus icon next to Actions and selecting a data processing action, or by typing SPL2 commands and functions directly in the editor.
  10. Select the plus icon next to Actions, then select Create metricization rule.
  11. Complete the following fields:
    1. Fill in a name for your metric in the Metric name field.
    2. Choose the type of your metric in Metric Type.
    3. Select the field that contains the value of your metric in Field.
    4. Select the field that contains the timestamp of your metric in Time field.
    5. Select what field(s) you want your metrics to be grouped by in Field dimensions.
    6. Each of these fields is an argument in your SPL2 statement as described below.

      SPL2 argument   Description

      name            The name of the metric.

      metrictype      Determines how the metric is interpreted and displayed in Splunk Observability Cloud. Choose from Gauge, Counter, and Cumulative Counter. See Metric types in the Splunk Observability Cloud documentation for more information.

                      count and sum aggregations should always use the counter metric type. average, min, and max aggregations should always use the gauge metric type.

      value           The value of the metric.

      time            The Unix time, in epoch seconds, of the metric.

      dimensions      Zero or more dimensions associated with the metric. If the metric has no dimensions, you can omit this argument.


    7. In the Metrics preview panel, select the Rollup function that corresponds to your metric type. Each metric type has a default rollup in Splunk Observability Cloud, and your selection in the Ingest Processor must match that default. See Metric types in the Splunk Observability Cloud documentation for more information on metric types and their corresponding default rollups.
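As a sketch, a single metricization rule created from the fields above maps each UI field to the corresponding logs_to_metrics argument. The metric name "response_time", the value field response_time_ms, and the "host" dimension here are hypothetical examples:

    | thru [
        | logs_to_metrics name="response_time" metrictype="gauge" value=response_time_ms time=_time dimensions={"host": host}
        | into $metrics_destination
    ]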
  12. Select Apply to confirm your metrics definitions.
  13. Repeat steps 10-12 to generate multiple metrics in your pipeline, if desired. A pipeline that generates multiple metrics looks similar to the following example:

    $pipeline = | from $source
        | thru [
            | logs_to_metrics name="mymetric" metrictype="metric_type" value=metric_value time=_time dimensions={"foo": bar}
            | into $metrics_destination
        ]
        | thru [
            | logs_to_metrics name="mymetric2" metrictype="metric_type2" value=metric_value2 time=_time dimensions={"foo": bar}
            | into $metrics_destination
        ]
        | into $destination;
  14. To save your pipeline, do the following:
    1. Select Save pipeline.
    2. In the Name field, enter a name for your pipeline.
    3. (Optional) In the Description field, enter a description for your pipeline.
    4. Select Save. The pipeline is now listed on the Pipelines page, and you can now apply it, as needed.
  15. To apply this pipeline, do the following:
    1. Navigate to the Pipelines page.
    2. In the row that lists your pipeline, select the Actions icon, and then select Apply. It can take a few minutes to finish applying your pipeline. During this time, all applied pipelines enter the Pending status.
    3. (Optional) To confirm that Ingest Processor has finished applying your pipeline, navigate to the Ingest Processor page.
Last modified on 26 March, 2024

This documentation applies to the following versions of Splunk Cloud Platform: 9.1.2308 (latest FedRAMP release), 9.1.2312

