This documentation does not apply to the most recent version of Splunk.
Calculates an expression and puts the resulting value into a field.
- Syntax: eval <field>=<expression>

Required arguments:

field
- Syntax: <string>
- Description: A field name for your evaluated value.

expression
- Syntax: <string>
- Description: A combination of values, variables, operators, and functions that represent the value of your destination field. The syntax of the expression is checked before the search runs, and an exception is thrown for an invalid expression. For example, the result of an eval statement is not allowed to be boolean. If Splunk cannot evaluate the expression successfully at search time for a given event, eval erases the value in the result field.
The following table lists the basic operations you can perform with eval. For these evaluations to work, your values need to be valid for the type of operation. For example, with the exception of addition, arithmetic operations may not produce valid results if the values are not numerical. When concatenating values, Splunk reads the values as strings (regardless of their value).
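The string-reading behavior of concatenation, and the failure mode for non-numeric arithmetic, can be sketched outside Splunk. This is a rough Python analogy, not SPL; the coercion rules shown are assumptions based on the description above:

```python
def eval_concat(a, b):
    # The concatenation operator reads both values as strings,
    # regardless of what they hold.
    return str(a) + str(b)

def eval_add(a, b):
    # Arithmetic only produces a result when both values are numeric;
    # otherwise the result field is left empty (modeled here as None).
    try:
        return float(a) + float(b)
    except (TypeError, ValueError):
        return None

print(eval_concat(404, "error"))  # 404error
print(eval_add("2", 3))           # 5.0
print(eval_add("two", 3))         # None
```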
The eval command includes the following functions:
abs(), case(), ceil(), ceiling(), cidrmatch(), coalesce(), commands(), exact(), exp(), floor(), if(), ifnull(), isbool(), isint(), isnotnull(), isnull(), isnum(), isstr(), len(), like(), ln(), log(), lower(), ltrim(), match(), max(), md5(), min(), mvappend(), mvcount(), mvindex(), mvfilter(), mvjoin(), mvzip(), now(), null(), nullif(), pi(), pow(), random(), relative_time(), replace(), round(), rtrim(), searchmatch(), sigfig(), spath(), split(), sqrt(), strftime(), strptime(), substr(), time(), tonumber(), tostring(), trim(), typeof(), upper(), urldecode(), validate().
For descriptions and examples of each function, see "Functions for eval and where".
Performs an evaluation of arbitrary expressions that can include mathematical, string, and boolean operations. The eval command requires that you specify a field name that takes the results of the expression you want to evaluate. If this destination field matches a field name that already exists, the values of the field are replaced by the results of the eval expression.
If you are using a search as an argument to the eval command and functions, you cannot use a saved search name; you have to pass a literal search string or a field that contains a literal search string (like the 'search' field extracted from index=_audit events).
This example shows how you might coalesce a field from two different source types and use that to create a transaction of events.
sourcetype=A has a field called number. sourcetype=B has the same information in a field called subscriberNumber.

sourcetype=A OR sourcetype=B | eval phone=coalesce(number,subscriberNumber) | transaction phone maxspan=2m

This search uses the eval command to add a common field, called phone, to each of the events, whether they are from sourcetype=A or sourcetype=B. The value of phone is defined, using the coalesce() function, as the values of number and subscriberNumber. The coalesce() function takes the value of the first field that is not NULL (that is, the first one that exists in the event). Now you're able to group events from either source type A or B if they share the same phone value.
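The coalesce() semantics can be sketched in Python. This is an illustration of the behavior, not Splunk code, and the sample phone numbers are made up:

```python
def coalesce(*values):
    # Return the value of the first field that is not NULL, i.e. the
    # first one that exists in the event; None if all are missing.
    for v in values:
        if v is not None:
            return v
    return None

event_a = {"number": "555-1234"}            # a sourcetype=A event
event_b = {"subscriberNumber": "555-9876"}  # a sourcetype=B event

for event in (event_a, event_b):
    phone = coalesce(event.get("number"), event.get("subscriberNumber"))
    print(phone)
```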
This example uses recent (September 23-29, 2010) earthquake data downloaded from the USGS Earthquakes website. The data is a comma separated ASCII text file that contains the source network (Src), ID (Eqid), version, date, location, magnitude, depth (km) and number of reporting stations (NST) for each earthquake over the last 7 days.
Download the text file, M 2.5+ earthquakes, past 7 days, save it as a CSV file, and upload it to Splunk. Splunk should extract the fields automatically. Note that you'll be seeing data from the 7 days previous to your download, so your results will vary from the ones displayed below.
Earthquakes occurring at a depth of less than 70 km are classified as shallow-focus earthquakes, while those with a focal-depth between 70 and 300 km are commonly termed mid-focus earthquakes. In subduction zones, deep-focus earthquakes may occur at much greater depths (ranging from 300 up to 700 kilometers).
Classify recent earthquakes based on their depth.
source=eqs7day-M1.csv | eval Description=case(Depth<=70, "Shallow", Depth>70 AND Depth<=300, "Mid", Depth>300, "Deep") | table Datetime, Region, Depth, Description
This search uses the eval command to create a field called Description, which takes the value of "Shallow", "Mid", or "Deep" based on the Depth of the earthquake. The case() function is used to specify which range of depths fits each description. For example, if the depth is 70 km or less, the earthquake is characterized as a shallow-focus quake, and the resulting Description is Shallow.

The search also pipes the results of eval into the table command. This formats a table to display the timestamp of the earthquake, the region in which it occurred, the depth in kilometers of the quake, and the corresponding description assigned by the eval expression.
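The first-match-wins behavior of case() can be sketched in Python. This is an illustrative analogy of the search's case() expression, not SPL:

```python
def describe_depth(depth):
    # Mirrors case(Depth<=70, "Shallow", Depth>70 AND Depth<=300, "Mid",
    # Depth>300, "Deep"): conditions are tried in order and the first
    # one that matches wins.
    if depth <= 70:
        return "Shallow"
    if depth <= 300:
        return "Mid"
    return "Deep"

print(describe_depth(35))   # Shallow
print(describe_depth(150))  # Mid
print(describe_depth(450))  # Deep
```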
This example uses the sample dataset from this topic in the tutorial but should work with any format of Apache Web access log. Download the data set from this topic in the tutorial and follow the instructions to upload it to Splunk. Then, run this search using the time range, Other > Yesterday.
Search for IP addresses and classify the network they belong to.
sourcetype=access_* | eval network=if(cidrmatch("192.0.0.0/16", clientip), "local", "other")
This example uses the cidrmatch() function to compare the IP addresses in the clientip field to a subnet range. The search also uses the if() function, which says that if the value of clientip falls within the subnet range, then network is given the value local. Otherwise, network=other.

The eval command does not do any special formatting to your results -- it just creates a new field that takes its value from the eval expression. After you run this search, use the field picker to add the network field to your results. Now you can see, inline with your search results, which IP addresses are part of your local network and which are not. Your events list should look something like this:
Another option for formatting your results is to pipe the results of eval to the table command to display only the fields of interest to you. (See Example 1.)
Note: This example just illustrates how to use the cidrmatch function. If you want to classify your events and quickly search for those events, the better approach is to use event types. Read more about event types in the Knowledge Manager manual.
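For intuition, cidrmatch()-style subnet testing can be approximated with Python's standard ipaddress module. This is an analogy, not Splunk code; the sample addresses are documentation addresses chosen for illustration:

```python
import ipaddress

def classify(clientip, subnet="192.0.0.0/16"):
    # Roughly if(cidrmatch(subnet, clientip), "local", "other"):
    # membership in the network decides the label.
    net = ipaddress.ip_network(subnet)
    return "local" if ipaddress.ip_address(clientip) in net else "other"

print(classify("192.0.2.56"))   # local
print(classify("203.0.113.9"))  # other
```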
This example uses generated email data (sourcetype="cisco_esa").
Use the email address field to extract the user's name and domain.
sourcetype="cisco_esa" mailfrom=* | eval accountname=split(mailfrom,"@") | eval from_user=mvindex(accountname,0) | eval from_domain=mvindex(accountname,-1) | table mailfrom, from_user, from_domain
This example uses the split() function to break the mailfrom field into a multivalue field called accountname. The first value of accountname is everything before the "@" symbol, and the second value is everything after.

The example then uses the mvindex() function to set from_user and from_domain to the first and second values of accountname, respectively.
The results of the eval expressions are then piped into the table command. You can see the original mailfrom values and the new from_user and from_domain values in the following results table:
Note: This example is really not that practical. It was written to demonstrate how to use an eval function to identify the individual values of a multivalue field. Because this particular set of email data did not have any multivalue fields, the example creates one (accountname) from a single value field (mailfrom).
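The split()/mvindex() steps map closely onto Python list indexing, including the -1 index for the last value. This is an illustrative sketch, not SPL, and the sample address is made up:

```python
mailfrom = "alice@example.com"     # a hypothetical mailfrom value
accountname = mailfrom.split("@")  # like split(mailfrom, "@"): a multivalue field
from_user = accountname[0]         # mvindex(accountname, 0): first value
from_domain = accountname[-1]      # mvindex(accountname, -1): last value
print(from_user, from_domain)      # alice example.com
```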
This example uses generated email data (sourcetype="cisco_esa").
This example classifies where an email came from based on the email address's domain: .com, .net, and .org addresses are considered local, while anything else is considered abroad. (Of course, domains that are not .com/.net/.org are not necessarily from abroad.)
sourcetype="cisco_esa" mailfrom=*| eval accountname=split(mailfrom,"@") | eval from_domain=mvindex(accountname,-1) | eval location=if(match(from_domain, "[^\n\r\s]+\.(com|net|org)"), "local", "abroad") | stats count by location
The first half of this search is similar to Example 3. The split() function is used to break up the email address in the mailfrom field, and the mvindex() function defines from_domain as the portion of the mailfrom field after the "@" symbol.

Then, the if() and match() functions are used: if the from_domain value ends with .com, .net, or .org, the location field is assigned the value local. If from_domain does not match, location is assigned the value abroad.

The eval results are then piped into the stats command to count the number of results for each location value and produce the following results table:
After you run the search, you can add the from_domain and location fields to your events to see the classification inline with your events. If your search results contain these fields, they will look something like this:
Note: This example merely illustrates using the match() function. If you want to classify your events and quickly search for those events, the better approach is to use event types. Read more about event types in the Knowledge Manager manual.
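The match() test is an unanchored regular-expression search. A rough Python equivalent using the same pattern (an analogy, not SPL):

```python
import re

def location(from_domain):
    # match(from_domain, "[^\n\r\s]+\.(com|net|org)") succeeds anywhere
    # in the string; a match classes the address as local.
    pattern = r"[^\n\r\s]+\.(com|net|org)"
    return "local" if re.search(pattern, from_domain) else "abroad"

print(location("example.com"))  # local
print(location("example.de"))   # abroad
```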
This example uses the sample dataset from this topic in the tutorial. Download the data set from this topic in the tutorial and follow the instructions to upload it to Splunk. Then, run this search using the time range, Other > Yesterday.
Reformat a numeric field measuring time in seconds into a more readable string format.
sourcetype=access_* | transaction clientip maxspan=10m | eval durationstr=tostring(duration,"duration")
This example uses the tostring() function and the duration option to convert the duration of the transaction into a more readable string formatted as HH:MM:SS. The duration is the time between the first and last events in the transaction and is given in seconds.

The search defines a new field, durationstr, for the reformatted duration value. After you run the search, you can use the field picker to show the two fields inline with your events. If your search results contain these fields, they will look something like this:
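What tostring(duration, "duration") computes can be approximated in Python. This is a sketch of the HH:MM:SS conversion, under the assumption that the duration is a whole number of seconds:

```python
def duration_str(seconds):
    # Convert a duration in seconds to an HH:MM:SS string,
    # roughly what the "duration" option of tostring() produces.
    h, rem = divmod(int(seconds), 3600)
    m, s = divmod(rem, 60)
    return f"{h:02d}:{m:02d}:{s:02d}"

print(duration_str(355))   # 00:05:55
print(duration_str(3661))  # 01:01:01
```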
Example 1: Change the format of the event's time and sort the results in descending order by new time.
... | bucket _time span=60m | eval Time=strftime(_time, "%m/%d %H:%M %Z") | stats avg(time_taken) AS AverageResponseTime BY Time | sort - Time
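Python's time.strftime shares these format codes, so the conversion can be previewed outside Splunk. The epoch value below is just an illustration, and %Z is omitted here because its output is platform-dependent:

```python
import time

# strftime(_time, "%m/%d %H:%M %Z") in eval formats an epoch time value;
# Python's time.strftime uses the same codes (gmtime keeps it deterministic).
epoch = 1285718400  # 2010-09-29 00:00:00 UTC, a hypothetical _time value
print(time.strftime("%m/%d %H:%M", time.gmtime(epoch)))  # 09/29 00:00
```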
Example 2: Set status to OK if error is 200; otherwise, Error.
... | eval status = if(error == 200, "OK", "Error")
Example 3: Set lowuser to the lowercase version of username.
... | eval lowuser = lower(username)
Example 4: Set sum_of_areas to be the sum of the areas of two circles.
... | eval sum_of_areas = pi() * pow(radius_a, 2) + pi() * pow(radius_b, 2)
Example 5: Set error_msg to descriptions for some simple HTTP error codes.
... | eval error_msg = case(error == 404, "Not found", error == 500, "Internal Server Error", error == 200, "OK")
Example 6: Set full_name to the concatenation of first_name, a space, and last_name.
... | eval full_name = first_name." ".last_name
Example 7: Display timechart of the avg of cpu_seconds by processor rounded to 2 decimal places.
... | timechart eval(round(avg(cpu_seconds),2)) by processor
Example 8: Convert a numeric field value to a string with commas and 2 decimal places. If the original value of x is 1000000, this returns x as 1,000,000.
... | eval x=tostring(x,"commas")
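A rough Python sketch of the "commas" formatting described above; treating integers without decimal places, and rounding non-integers to two, is an assumption based on this description:

```python
def tostring_commas(x):
    # Approximate tostring(x, "commas"): insert thousands separators,
    # and round non-integer values to two decimal places.
    if float(x).is_integer():
        return f"{int(x):,}"
    return f"{x:,.2f}"

print(tostring_commas(1000000))    # 1,000,000
print(tostring_commas(12345.678))  # 12,345.68
```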
Have questions? Visit Splunk Answers and see what questions and answers the Splunk community has using the eval command.