Ingest Processor is currently released as a preview only and is not officially supported. See Splunk General Terms for more information. For any questions on this preview, please reach out to ingestprocessor@splunk.com.
Extract timestamps from event data using Ingest Processor
You can use a pipeline to extract timestamp fields from your event data and convert those timestamps into specific formats. Extracting timestamps lets you visualize events by time and convert timestamps into the appropriate format before sending events to a destination.
For example, if you have data that contains timestamps with multiple formats, you can convert the timestamp information from your data into a specific format directly from the pipeline editor. This is helpful when certain destinations require you to store timestamps using a particular format.
To convert a timestamp, you write an SPL2 pipeline statement that extracts timestamps from your data and then converts them into a consistent format, normalizing the data from those fields.
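The kind of normalization this pipeline performs can be pictured outside of SPL2. The following is a minimal Python sketch, not Ingest Processor code; the input timestamps and the list of known formats are assumptions for illustration:

```python
from datetime import datetime

# Hypothetical incoming timestamps in two different formats.
raw_timestamps = ["2023-01-13 04:41:00", "13/01/2023 04:45:00"]
known_formats = ["%Y-%m-%d %H:%M:%S", "%d/%m/%Y %H:%M:%S"]

def normalize(ts: str, target_format: str = "%Y-%m-%dT%H:%M:%SZ") -> str:
    """Try each known source format and re-emit the timestamp in one target format."""
    for fmt in known_formats:
        try:
            return datetime.strptime(ts, fmt).strftime(target_format)
        except ValueError:
            continue
    raise ValueError(f"Unrecognized timestamp format: {ts}")

# Both timestamps come out in a single ISO 8601 style format.
print([normalize(ts) for ts in raw_timestamps])
```

The pipeline you build in the steps below does the same job declaratively: a regular expression extracts the timestamp, and a conversion step rewrites it into the target format.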
Steps
- Navigate to the Pipelines page, select New pipeline, and then select Ingest Processor pipeline.
- On the Get started page, select Blank pipeline and then Next.
- On the Define your pipeline's partition page, do the following:
- Select how you want to partition the incoming data that is sent to your pipeline. You can partition by source type, source, and host.
- Enter the conditions for your partition, including the operator and the value. Your pipeline will receive and process the incoming data that meets these conditions.
- Select Next to confirm the pipeline partition.
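Conceptually, a partition is a filter on event metadata. The following Python sketch illustrates the idea; the event dictionaries, field names, and supported operators here are assumptions for illustration, not the product's implementation:

```python
# Hypothetical events tagged with metadata, as a pipeline partition would see them.
events = [
    {"sourcetype": "store:purchases", "host": "web-01", "_raw": "..."},
    {"sourcetype": "syslog", "host": "web-02", "_raw": "..."},
]

# A partition condition such as: sourcetype equals "store:purchases".
def matches_partition(event, field="sourcetype", operator="=", value="store:purchases"):
    if operator == "=":
        return event.get(field) == value
    if operator == "!=":
        return event.get(field) != value
    raise ValueError(f"Unsupported operator: {operator}")

# Only events that satisfy the condition reach the pipeline.
partitioned = [e for e in events if matches_partition(e)]
print(len(partitioned))
```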
- To specify some sample data for your pipeline, do the following:
- On the Add sample data page, enter or upload sample data for generating previews that show how your pipeline processes data. The sample data must contain accurate examples of the values that you want to extract into fields.
For example, the following sample events represent purchases made at a store at a particular time:
E9FF471F36A91031FE5B6D6228674089, 72E0B04464AD6513F6A613AABB04E701, Credit Card, 7.7, 2023-01-13 04:41:00, 2023-01-13 04:45:00, -73.997292, 40.720982, 4532038713619608
A5D125F5550BE7822FC6EE156E37733A, 08DB3F9FCF01530D6F7E70EB88C3AE5B, Credit Card, 14, 2023-01-13 04:37:00, 2023-01-13 04:47:00, -73.966843, 40.756741, 4539385381557252
1E65B7E2D1297CF3B2CA87888C05FE43, F9ABCCCC4483152C248634ADE2435CF0, Game Card, 16.5, 2023-01-13 04:26:00, 2023-01-13 04:46:00, -73.956451, 40.771442
- Select Next to confirm any sample data that you want to use for your pipeline.
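Before building the pipeline, it can help to confirm that the sample events actually contain parseable timestamps. This short Python sketch assumes the comma-separated field order shown in the sample events above (the two timestamp columns are the fifth and sixth fields):

```python
from datetime import datetime

# One of the sample purchase events; field order is assumed from the example data.
sample = ("E9FF471F36A91031FE5B6D6228674089, 72E0B04464AD6513F6A613AABB04E701, "
          "Credit Card, 7.7, 2023-01-13 04:41:00, 2023-01-13 04:45:00, "
          "-73.997292, 40.720982, 4532038713619608")

fields = [f.strip() for f in sample.split(",")]
start, end = fields[4], fields[5]  # the two timestamp columns

# Confirm both values parse with the expected format before building the pipeline.
fmt = "%Y-%m-%d %H:%M:%S"
duration = datetime.strptime(end, fmt) - datetime.strptime(start, fmt)
print(duration)  # 0:04:00
```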
- On the Select destination dataset page, select the name of the destination that you want to send data to, then do the following:
- If you selected a Splunk platform destination, select Next.
- If you selected another type of destination, select Done and skip the next step.
- (Optional) If you're sending data to a Splunk platform deployment, you can specify a target index:
- In the Index name field, select the name of the index that you want to send your data to.
- (Optional) In some cases, incoming data already specifies a target index. If you want your Index name selection to override previous target index settings, then select the Overwrite previously specified target index check box.
- Select Done.
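The override behavior in the previous step can be sketched as a small precedence rule. This Python illustration is an assumption about the semantics described above (the pipeline's index selection replaces an index already set on the event only when the overwrite option is enabled), not the product's actual implementation:

```python
# Hypothetical sketch: resolve which index an event is sent to.
def resolve_index(event_index, pipeline_index, overwrite=False):
    if event_index is None:
        return pipeline_index      # no previous target index, use the pipeline's
    return pipeline_index if overwrite else event_index

print(resolve_index("main", "purchases", overwrite=False))  # keeps "main"
print(resolve_index("main", "purchases", overwrite=True))   # overrides to "purchases"
print(resolve_index(None, "purchases"))                     # falls back to "purchases"
```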
- To generate a preview of how your pipeline processes data based on the sample data that you provided, select the Preview Pipeline icon.
- Select the plus icon in the Actions section, then select Extract fields from _raw.
- In the Extract fields from _raw dialog box, do the following:
- In the Regular expression field, specify one or more named capture groups using Regular Expression 2 (RE2) syntax. The name of the capture group determines the name of the extracted field, and the matched values determine the values of the extracted field. You can select named capture groups from the Insert from library list or enter named capture groups directly in the field.
For example, to extract timestamps from the sample events described previously, select Timestamp_ISO8601 from the Insert from library list. The resulting regular expression looks like this:
(?P<Timestamp_ISO8601>(?:\d\d){1,2}-(?:0?[1-9]|1[0-2])-(?:(?:0[1-9])|(?:[12][0-9])|(?:3[01])|[1-9])[T ](?:2[0123]|[01]?[0-9]):?(?:[0-5][0-9])(?::?(?:(?:[0-5]?[0-9]|60)(?:[:.,][0-9]+)?))?(?:Z|[+-](?:2[0123]|[01]?[0-9])(?::?(?:[0-5][0-9])))?)
For more information about the supported regular expression syntax, see Regular expression syntax for Ingest Processor pipelines.
- (Optional) By default, the regular expression matches are case sensitive. To make the matches case insensitive, uncheck the Match case check box.
- Use the Events preview pane to validate your regular expression. The events in this pane are based on the last time that you generated a pipeline preview, and the pane highlights the values that match your regular expression for extraction.
- When you are satisfied with the events highlighted in the Event preview pane, select Apply to perform the timestamp extraction. A rex command is added to the SPL2 pipeline statement, and the new field appears in the Fields list.
- Convert the extracted timestamp to a different format and store it in a _time field by doing the following:
- In the preview results panel, hover over the header of the Timestamp_ISO8601 field and then select the Options for "Timestamp_ISO8601" icon. Then, select Convert _time from Timestamp_ISO8601.
- In the Source timestamp format field, specify the current format of the timestamps using the time variables supported in SPL2. See Using time variables in the SPL2 Search Manual for more information.
- Select Apply.
A new _time field is added to your events. This field stores the timestamp in Unix time, but continues to display the timestamp in a human-readable format. This ensures an accurate timestamp value even if you access the data from a different time zone.
- (Optional) After storing your event timestamps in the _time field, you can remove the Timestamp_ISO8601 field. To do this, add the following fields command to your SPL2 pipeline statement. Make sure to place this command after your field extraction and conversion expressions. For example:
| fields - Timestamp_ISO8601
- To save your pipeline, do the following:
- Select Save pipeline.
- In the Name field, enter a name for your pipeline.
- (Optional) In the Description field, enter a description for your pipeline.
- Select Save.
- Apply the pipeline. If you're sending data to a Splunk platform deployment, be aware that the destination index is determined by a precedence order of configurations.
It can take a few minutes for the Ingest Processor to finish applying your pipeline. During this time, all applied pipelines have a Pending status. Once the operation is complete, the Pending Apply status icon stops displaying beside the pipeline. Refresh your browser to confirm that the icon no longer displays.
This documentation applies to the following versions of Splunk Cloud Platform™: 9.1.2308 (latest FedRAMP release), 9.1.2312