Extract timestamps from event data using Ingest Processor


You can use a pipeline to extract timestamp fields from your event data and convert those timestamps into specific formats. Extracting a timestamp field lets you visualize events by time and convert the timestamps into the appropriate format before sending the data to a destination.

For example, if you have data that contains timestamps with multiple formats, you can convert the timestamp information from your data into a specific format directly from the pipeline editor. This is helpful when certain destinations require you to store timestamps using a particular format.

To convert a timestamp, write an SPL2 pipeline statement that extracts the timestamp from your data and then converts it to normalize the data in that field.
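
The pipeline builder generates and updates this SPL2 statement for you as you complete the steps below. As a rough point of reference, a newly created pipeline typically starts from a skeleton similar to the following sketch; the exact statement depends on the partition, sample data, and destination that you configure:

    $pipeline = | from $source | into $destination;

The extraction and conversion steps in this procedure add commands between the from and into commands of this statement.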

  1. On the Pipelines page, select New pipeline, and then select Ingest Processor pipeline. Follow the on-screen instructions to define a partition, enter sample data, and select a data destination.
    After you complete the on-screen instructions, the pipeline builder displays the SPL2 statement for your pipeline.
  2. To generate a preview of how your pipeline processes data based on the sample data that you provided, select the Preview Pipeline icon (Image of the Preview Pipeline icon).
  3. To extract date and time information from the raw data and store it in an event field, do the following:
    1. Select the plus icon (This image shows an icon of a plus sign.) from the Actions section, then select Extract fields from _raw.
    2. In the Regular expression field, specify one or more named capture groups using Regular Expression 2 (RE2) syntax.
      The name of the capture group determines the name of the extracted field, and the matched values determine the values of the extracted field. You can select named capture groups from the Insert from library list or enter named capture groups directly in the field.
      For example, if your raw data contains date and time information formatted like 2023-01-13 04:41:00, you can extract that information into an event field named Timestamp_ISO8601 by selecting Timestamp_ISO8601 from the Insert from library list. The resulting regular expression looks like this:
      (?P<Timestamp_ISO8601>(?:\d\d){1,2}-(?:0?[1-9]|1[0-2])-(?:(?:0[1-9])|(?:[12][0-9])|(?:3[01])|[1-9])[T ](?:2[0123]|[01]?[0-9]):?(?:[0-5][0-9])(?::?(?:(?:[0-5]?[0-9]|60)(?:[:.,][0-9]+)?))?(?:Z|[+-](?:2[0123]|[01]?[0-9])(?::?(?:[0-5][0-9])))?)
      

      For more information about the supported regular expression syntax, see Regular expression syntax for Ingest Processor pipelines.

    3. (Optional) By default, the regular expression matches are case sensitive. To make the matches case insensitive, uncheck the Match case check box.
    4. Use the Events preview pane to validate the results of your regular expression. The events in this pane are based on the last time that you generated a pipeline preview, and the pane highlights the values that match your regular expression for extraction.
    5. When you are satisfied with the events highlighted in the Events preview pane, select Apply to perform the timestamp extraction.

    A rex command is added to the pipeline, and the new field appears in the Fields list.

  4. To convert the extracted timestamp to UNIX time and store it in a field named _time, do the following:
    1. In the preview results panel, hover over the header of the field that you extracted during the previous step, and then select the Options for "<field_name>" icon (Image of the Options icon). Then, select Convert _time from <field_name>.
    2. In the Source timestamp format field, specify the current format of the timestamp using SPL2 time variables. See Using time variables in the SPL2 Search Manual for more information.
      For example, if your extracted timestamps are formatted like 2023-01-13 04:41:00, then set Source timestamp format to %Y-%m-%d %H:%M:%S. To include subseconds, use the format %Y-%m-%dT%H:%M:%S.%Q%Z.
    3. Select Apply.
    An eval command containing the strptime() function is added to the pipeline, and a new field named _time is added to your events. This field stores the timestamp in UNIX time format, but displays it in a human-readable format. This ensures an accurate timestamp value even if you access the data from a different time zone. For an example of how these commands appear in the pipeline statement, see the sketch after this procedure.

  5. (Optional) After creating the _time field, you can remove the field that you extracted during step 3. To do this, add the following fields command to your SPL2 pipeline statement. Make sure to place this command after your field extraction and conversion expressions.
    | fields - <field_name>
    

    For example, if the name of the field that you extracted during step 3 is Timestamp_ISO8601, then add the following command to your pipeline:

    | fields - Timestamp_ISO8601
    
  6. Save your pipeline and then apply it to the Ingest Processor.

    You can only apply pipelines that are in the Healthy status.
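
For reference, after you complete these steps, the full pipeline statement might look similar to the following sketch. This example assumes the extracted field is named Timestamp_ISO8601, substitutes a simplified capture group for the longer library regular expression shown earlier, and includes the optional fields command; the statement that the pipeline builder generates for you can differ:

    $pipeline = | from $source
        | rex field=_raw /(?P<Timestamp_ISO8601>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})/
        | eval _time = strptime(Timestamp_ISO8601, "%Y-%m-%d %H:%M:%S")
        | fields - Timestamp_ISO8601
        | into $destination;

In this sketch, the rex command extracts the timestamp into the Timestamp_ISO8601 field, the eval command converts it to UNIX time and stores it in _time, and the fields command removes the intermediate extracted field before the data is sent to the destination.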


It can take a few minutes for the Ingest Processor to finish applying your pipeline. During this time, all applied pipelines have a Pending status. Once the operation is complete, the Pending Apply status icon (Image of pending status icon) stops displaying beside the pipeline, and all affected pipelines transition from the Pending status back to the Healthy status. Refresh your browser to check whether the Pending Apply status icon no longer displays.
