Splunk Cloud Platform

Use Edge Processors

Extract timestamps from event data using an Edge Processor

You can use a pipeline to extract timestamp fields and convert those timestamps into specific formats. Extracting timestamps lets you visualize events by time and convert the timestamps into the appropriate format before sending the data to a destination.

For example, if you have data that contains timestamps with multiple formats, you can convert the timestamp information from your data into a specific format directly from the pipeline editor. This is helpful when certain destinations require you to store timestamps using a particular format.

To convert a timestamp, write an SPL2 pipeline statement that extracts the timestamp from your data and then converts it to normalize the data in that field. See Extract fields from event data using an Edge Processor for more information on field extraction.
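For example, the overall shape of such a pipeline is a rex command that extracts the timestamp into a named field, followed by an eval command that converts that field into the _time field. The following statement is a minimal sketch only: it assumes a simplified capture pattern and the timestamp format used in the sample events later in this topic, and your partition, extraction, and destination settings will differ.

$pipeline = | from $source
| rex field=_raw /(?P<Timestamp_ISO8601>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})/
| eval _time = strptime(Timestamp_ISO8601, "%Y-%m-%d %H:%M:%S")
| into $destination;

The steps that follow show how the pipeline editor builds an equivalent statement for you.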


Steps

  1. Navigate to the Pipelines page and then select New pipeline.
  2. Select Blank pipeline and then select Next.
  3. Specify a subset of the data received by the Edge Processor for this pipeline to process. To do this, you must define a partition by completing these steps:
    1. Select the plus icon (+) next to Partition, or select the option in the Suggestions section that matches how you want to create your partition.
    2. In the Field field, specify the event field that you want the partitioning condition to be based on.
    3. To specify whether the pipeline includes or excludes the data that meets the criteria, select Keep or Remove.
    4. In the Operator field, select an operator for the partitioning condition.
    5. In the Value field, enter the value that your partition should filter by to create the subset. Then select Apply. You can create additional conditions for the partition by selecting the plus icon (+).
    6. Once you have defined your partition, select Next.
  4. Enter or upload sample data for generating previews that show how your pipeline processes data. The sample data must contain accurate examples of the values that you want to extract into fields.

    For example, the following sample events represent purchases made at a store at a particular time:

    E9FF471F36A91031FE5B6D6228674089, 72E0B04464AD6513F6A613AABB04E701, Credit Card, 7.7, 2023-01-13 04:41:00, 2023-01-13 04:45:00, -73.997292, 40.720982, 4532038713619608
    A5D125F5550BE7822FC6EE156E37733A, 08DB3F9FCF01530D6F7E70EB88C3AE5B, Credit Card,14, 2023-01-13 04:37:00, 2023-01-13 04:47:00, -73.966843,40.756741, 4539385381557252
    1E65B7E2D1297CF3B2CA87888C05FE43,F9ABCCCC4483152C248634ADE2435CF0, Game Card, 16.5, 2023-01-13 04:26:00, 2023-01-13 04:46:00, -73.956451, 40.771442		
    
  5. Select the name of the destination that you want to send data to. Then, do the following:
    1. If you selected a Splunk platform S2S or Splunk platform HEC destination, select Next.
    2. If you selected another type of destination, select Done and skip the next step.
  6. (Optional) If you're sending data to a Splunk platform deployment, you can specify a target index:
    1. In the Index name field, select the name of the index that you want to send your data to.
    2. (Optional) In some cases, incoming data already specifies a target index. If you want your Index name selection to override previous target index settings, then select the Overwrite previously specified target index check box.
    3. Select Done.
    4. Be aware that the destination index is determined by a precedence order of configurations. See How does an Edge Processor know which index to send data to? for more information.

  7. To generate a preview of how your pipeline processes data based on the sample data that you provided, select the Preview Pipeline icon.
  8. Select the plus icon (+) in the Actions section, then select Extract fields from _raw.
  9. In the Extract fields from _raw dialog box, do the following:
    1. In the Regular expression field, specify one or more named capture groups using Regular Expression 2 (RE2) syntax. The name of the capture group determines the name of the extracted field, and the matched values determine the values of the extracted field. You can select named capture groups from the Insert from library list or enter named capture groups directly in the field.
      For example, to extract timestamps from the sample events described previously, select Timestamp_ISO8601 from the Insert from library list. The resulting regular expression looks like this:
      (?P<Timestamp_ISO8601>(?:\d\d){1,2}-(?:0?[1-9]|1[0-2])-(?:(?:0[1-9])|(?:[12][0-9])|(?:3[01])|[1-9])[T ](?:2[0123]|[01]?[0-9]):?(?:[0-5][0-9])(?::?(?:(?:[0-5]?[0-9]|60)(?:[:.,][0-9]+)?))?(?:Z|[+-](?:2[0123]|[01]?[0-9])(?::?(?:[0-5][0-9])))?)
      
    2. (Optional) By default, the regular expression matches are case sensitive. To make the matches case insensitive, uncheck the Match case check box.
    3. Use the Events preview pane to validate your regular expression. The events in this pane are based on the last time that you generated a pipeline preview, and the pane highlights the values that match your regular expression for extraction.
    4. When you are satisfied with the events highlighted in the Events preview pane, select Apply to perform the timestamp extraction.

    A rex command is added to the SPL2 pipeline statement, and the new field appears in the Fields list.

  10. Convert the extracted timestamp to a different format and store it in the _time field by doing the following:
    1. In the preview results panel, hover over the header of the Timestamp_ISO8601 field and then select the Options for "Timestamp_ISO8601" icon. Then, select Convert _time from Timestamp_ISO8601.
    2. In the Source timestamp format field, specify the current format of the timestamps using the time variables supported in SPL2. For example, the timestamps in the sample events match the format %Y-%m-%d %H:%M:%S. See Using time variables in the SPL2 Search Manual for more information.
    3. Select Apply.
  11. A new _time field is added to your events. This field stores the timestamp in Unix time format, but continues to display the timestamp in a human-readable format. This ensures an accurate timestamp value even if you access the data from a different time zone. For an illustration of how the stored and displayed values relate, see the sketch after these steps.

  12. (Optional) After storing your event timestamps in the _time field, you can remove the Timestamp_ISO8601 field. To do this, add the following fields command to your SPL2 pipeline statement. Make sure to place this command after your field extraction and conversion expressions.
    | fields - Timestamp_ISO8601
    
  13. To save your pipeline, do the following:
    1. Select Save pipeline.
    2. In the Name field, enter a name for your pipeline.
    3. (Optional) In the Description field, enter a description for your pipeline.
    4. Select Save.
  14. To apply this pipeline to an Edge Processor, do the following:
    1. Navigate to the Pipelines page.
    2. In the row that lists your pipeline, select the Options icon and then select Apply/Remove.
    3. Select the Edge Processors that you want to apply the pipeline to, and then select Save.

    You can only apply pipelines to Edge Processors that have a healthy status.

    It can take a few minutes for the Edge Processor service to finish applying your pipeline to an Edge Processor. During this time, all Edge Processors that the pipeline is applied to will have a Pending status. To confirm that the process completed successfully, do the following:

    • Navigate to the Edge Processors page. Then, verify that the Instance health column for the affected Edge Processors shows that all instances are back in the Healthy status.
    • Navigate to the Pipelines page. Then, verify that the Applied column for the pipeline displays the applied pipeline icon.

    You might need to refresh your browser to see the latest updates.
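As noted in the step where the _time field is added, the pipeline stores the timestamp as Unix time while the preview displays it in a human-readable form. The following sketch illustrates that relationship. It is not part of the generated pipeline: the readable_time field is hypothetical, and the format string assumes the timestamps in the sample events.

    | eval _time = strptime(Timestamp_ISO8601, "%Y-%m-%d %H:%M:%S")
    // _time now holds the number of seconds since the Unix epoch.
    | eval readable_time = strftime(_time, "%Y-%m-%d %H:%M:%S")
    // readable_time renders the same instant back into a human-readable string.

Because _time is stored as a number of seconds, the same value can be displayed in any time zone or format without changing the underlying timestamp.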

Example: Extract and convert timestamp fields and drop all other data values

Consider the following events, which represent purchases and the times they were made at a store:

E9FF471F36A91031FE5B6D6228674089, 72E0B04464AD6513F6A613AABB04E701, Credit Card, 7.7, 2023-01-13 04:41:00 AM, 2023-01-13 04:45:00 AM, -73.997292, 40.720982, 4532038713619608
A5D125F5550BE7822FC6EE156E37733A, 08DB3F9FCF01530D6F7E70EB88C3AE5B, Credit Card, 14, 2023-01-13 04:37:00 AM, 2023-01-13 04:47:00 AM, -73.966843,40.756741, 4539385381557252
1E65B7E2D1297CF3B2CA87888C05FE43,F9ABCCCC4483152C248634ADE2435CF0, Game Card, 16.5, 2023-01-13 04:26:00 AM, 2023-01-13 04:46:00 AM, -73.956451, 40.771442

You only want to keep the information about when the purchase was made. You want to convert the timestamp into a different format and then drop the rest of the data, which includes confidential information such as credit card numbers.

In the Extract fields from _raw dialog box, do the following:

  1. To extract the timestamp of the purchase into a field named Timestamp_ISO8601, select Timestamp_ISO8601 from the Insert from library list.
  2. In the preview results panel, select the header of the Timestamp_ISO8601 field to make the Options icon appear. Select that icon to open the Options menu and then select Convert Timestamp_ISO8601 to update _time.
  3. Specify a Source timestamp format from the menu and click Apply.
  4. (Optional) Drop the Timestamp_ISO8601 field and only keep the _time field that you extracted by adding the following fields command in the SPL2 statement of the pipeline. Make sure to place this command after the commands that are used for extracting and converting the timestamp.
    | fields - Timestamp_ISO8601
    

After completing these steps, you'll have a pipeline with the following SPL2 statement:

$pipeline = | from $source 
| rex field=_raw /(?P<Timestamp_ISO8601>(?:\d\d){1,2}-(?:0?[1-9]|1[0-2])-(?:(?:0[1-9])|(?:[12][0-9])|(?:3[01])|[1-9])[T ](?:2[0123]|[01]?[0-9]):?(?:[0-5][0-9])(?::?(?:(?:[0-5]?[0-9]|60)(?:[:.,][0-9]+)?))?(?:Z|[+-](?:2[0123]|[01]?[0-9])(?::?(?:[0-5][0-9])))?)/
| eval _time = strptime(Timestamp_ISO8601, "%Y-%m-%d %H:%M:%S")
| fields - Timestamp_ISO8601
| into $destination;

The preview results show the following timestamps from your events. The _time values are displayed in your local time zone, so they can differ from the literal timestamps in the raw events:

_time
8:41:00 PM 12 Jan 2023
8:37:00 PM 12 Jan 2023
8:26:00 PM 12 Jan 2023
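The pipeline stores _time as Unix time. If a destination requires timestamps as strings in a particular format instead, one option is to render _time into an additional field with the strftime function before sending the data. This is a sketch only: the formatted_time field name and the output format are assumptions, not part of the example above.

    | eval formatted_time = strftime(_time, "%Y-%m-%dT%H:%M:%S%z")
    // formatted_time holds the event time as an ISO 8601 style string for destinations that expect text timestamps.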

Last modified on 24 January, 2024

This documentation applies to the following versions of Splunk Cloud Platform: 9.0.2209, 9.0.2303, 9.0.2305, 9.1.2308 (latest FedRAMP release), 9.1.2312

