Splunk® Enterprise

Search Reference


eval

Description

The eval command calculates an expression and puts the resulting value into a search results field.

  • If the field name that you specify does not match a field in the output, a new field is added to the search results.
  • If the field name that you specify matches a field name that already exists in the search results, the results of the eval expression overwrite the values in that field.

The eval command evaluates mathematical, string, and boolean expressions.

You can chain multiple eval expressions in one search using a comma to separate subsequent expressions. The search processes multiple eval expressions left-to-right and lets you reference previously evaluated fields in subsequent expressions.
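
For example, the following sketch (the price and quantity field names are hypothetical) calculates a total field and then reuses it in the next expression:

... | eval total=price*quantity, discounted=total*0.9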


Difference between eval and stats commands

The stats command calculates statistics based on fields in your events. The eval command creates new fields in your events by using existing fields and an evaluation expression.

[Image: A table with columns A, B, C, and D is piped through ... | eval E = ..., producing a table with a new column E added on the right. The eval command adds the calculated column to your output.]
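
To make the difference concrete, compare these two sketches, which assume hypothetical bytes and host fields. The stats search collapses the events into one summary row for each host:

... | stats avg(bytes) AS avg_bytes BY host

The eval search keeps every event and adds a calculated field to each one:

... | eval kilobytes=bytes/1024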

Syntax

eval <field>=<expression>["," <field>=<expression>]...

Required arguments

field
Syntax: <string>
Description: A destination field name for the resulting calculated value. If the field name already exists in your events, eval overwrites the value.
expression
Syntax: <string>
Description: A combination of values, variables, operators, and functions that are run to determine the value to place in your destination field.
The syntax of the eval expression is checked before running the search, and an exception is thrown for an invalid expression.
  • The result of an eval expression cannot be Boolean.
  • If, at search time, the expression cannot be evaluated successfully for a given event, the eval command erases the resulting field.
  • If the expression references a field name that contains non-alphanumeric characters, the field name needs to be surrounded by single quotation marks. For example, if the field name is server-1, you specify it like this: new=count+'server-1'.
  • If the expression references a literal string, the string needs to be surrounded by double quotation marks. For example, if the string you want to use is server-, you specify it like this: new="server-".host. A combined example follows this list.
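
For example, both quoting rules can appear in a single expression. This sketch assumes hypothetical count, server-1, and host fields:

... | eval new=count+'server-1', label="server-".host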

Usage

The eval command is a distributable streaming command. See Command types.

General

When you use the eval command, you must specify a field name. The results of the <expression> that you want to evaluate are placed in that field.

If the field name that you specify matches a field name that already exists, the values in the existing field are replaced by the results of the eval expression.

Number and string values returned by the <expression> can be added to the field that you specify. Expressions that evaluate to Boolean values cannot be added to the field directly. However, you can convert Boolean and null values to strings by using the tostring() function. The resulting string values can then be added to the field.
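
For example, the following sketch uses tostring() to convert the Boolean result of a comparison into a string that can be stored in a field. The value field is hypothetical. The above_threshold field then holds a string such as "True" or "False" rather than a Boolean value:

... | eval above_threshold=tostring(value > 100)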

During calculations, numbers are double-precision floating-point numbers, subject to all the usual behaviors of floating-point arithmetic. An operation that results in NaN assigns the string "nan" to the field. Positive and negative overflow result in "inf" and "-inf". Division by zero results in a null field.
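
A quick way to observe these behaviors is to generate a single test event with the makeresults command. In this sketch, divzero produces a null field and overflow produces "inf":

| makeresults | eval divzero=1/0, overflow=pow(10,1000)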

If you use a search as an argument to the eval command and its functions, you cannot use a saved search name. You must pass a literal search string, or a field that contains a literal search string (such as the 'search' field extracted from index=_audit events).

Functions

You can use a wide range of functions with the eval command. For general information about using functions, see Evaluation functions.

The following list shows the supported functions, organized by type. See the Evaluation functions documentation to learn more about each function and to see examples.

Comparison and Conditional functions: case(X,"Y",...), cidrmatch("X",Y), coalesce(X,...), false(), if(X,Y,Z), in(VALUE-LIST), like(TEXT, PATTERN), match(SUBJECT, "REGEX"), null(), nullif(X,Y), searchmatch(X), true(), validate(X,Y,...)

Conversion functions: printf("format",arguments), tonumber(NUMSTR,BASE), tostring(X,Y)

Cryptographic functions: md5(X), sha1(X), sha256(X), sha512(X)

Date and Time functions: now(), relative_time(X,Y), strftime(X,Y), strptime(X,Y), time()

Informational functions: isbool(X), isint(X), isnotnull(X), isnull(X), isnum(X), isstr(X), typeof(X)

Mathematical functions: abs(X), ceiling(X), exact(X), exp(X), floor(X), ln(X), log(X,Y), pi(), pow(X,Y), round(X,Y), sigfig(X), sqrt(X)

Multivalue eval functions: commands(X), mvappend(X,...), mvcount(MVFIELD), mvdedup(X), mvfilter(X), mvfind(MVFIELD,"REGEX"), mvindex(MVFIELD,STARTINDEX,ENDINDEX), mvjoin(MVFIELD,STR), mvrange(X,Y,Z), mvsort(X), mvzip(X,Y,"Z"), split(X,"Y")

Statistical eval functions: max(X,...), min(X,...), random()

Text functions: len(X), lower(X), ltrim(X,Y), replace(X,Y,Z), rtrim(X,Y), spath(X,Y), substr(X,Y,Z), trim(X,Y), upper(X), urldecode(X)

Trigonometry and Hyperbolic functions: acos(X), acosh(X), asin(X), asinh(X), atan(X), atan2(X,Y), atanh(X), cos(X), cosh(X), hypot(X,Y), sin(X), sinh(X), tan(X), tanh(X)

Operators

The following table lists the basic operations you can perform with the eval command. For these evaluations to work, the values need to be valid for the type of operation. For example, with the exception of addition, arithmetic operations might not produce valid results if the values are not numerical. When concatenating values with the period ( . ) operator, Splunk software reads the values as strings, regardless of their actual type.

Type Operators
Arithmetic + - * / %
Concatenation .
Boolean AND OR NOT XOR < > <= >= != = == LIKE

Operators that produce numbers

  • The plus ( + ) operator accepts two numbers for addition, or two strings for concatenation.
  • The subtraction ( - ), multiplication ( * ), division ( / ), and modulus ( % ) operators accept two numbers.
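
For example, this sketch uses the makeresults command to generate a single test event and apply each of these operators to literal numbers:

| makeresults | eval sum=7+3, difference=7-3, product=7*3, quotient=7/2, remainder=7%3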

Operators that produce strings

  • The period ( . ) operator concatenates both strings and numbers. Numbers are concatenated in their string representation.
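
For example, concatenating a string literal with a number produces a string. In this sketch, the label field gets the string value port 8080:

| makeresults | eval label="port ".8080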

Operators that produce booleans

  • The AND, OR, and XOR operators accept two Boolean values.
  • The <, >, <=, >=, !=, =, and == operators accept two numbers or two strings. The single equal sign ( = ) is a synonym for the double equal sign ( == ).
  • The LIKE operator accepts two strings and performs a pattern match similar to the SQL LIKE operator, in the form string LIKE pattern. The pattern supports literal text, a percent ( % ) character for a wildcard, and an underscore ( _ ) character for a single character match. For example, field LIKE "a%b_" matches any string starting with a, followed by anything, followed by b, followed by one character. A usage sketch follows this list.
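
Because an eval expression cannot return a Boolean result directly, wrap these operators in a function such as if() when you use them with the eval command. This sketch assumes hypothetical status and uri fields:

... | eval outcome=if(status == 200, "success", "failure"), page=if(uri LIKE "/product%", "product page", "other page")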

Field names

To specify a field name with multiple words, you can either concatenate the words, or use single quotation marks when you specify the name. For example, to specify the field name Account ID you can specify AccountID or 'Account ID'.

To specify a field name with special characters, such as a period, use single quotation marks. For example, to specify the field name Last.Name use 'Last.Name'.

You can use the value of another field as the name of the destination field by using curly brackets, { }. For example, if an event has the fields aName=counter and aValue=1234, the search | eval {aName}=aValue returns counter=1234.
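
Here is a self-contained sketch of this behavior. It uses the makeresults command to create the aName and aValue fields described above, and the final eval creates a field named counter with the value 1234:

| makeresults | eval aName="counter", aValue=1234 | eval {aName}=aValue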

Calculated fields

You can use eval statements to define calculated fields by defining the eval statement in props.conf. If you are using Splunk Cloud, you can define calculated fields using Splunk Web, by choosing Settings > Fields > Calculated Fields. When you run a search, Splunk software evaluates the statements and creates fields in a manner similar to that of search time field extraction. Setting up calculated fields means that you no longer need to define the eval statement in a search string. Instead, you can search on the resulting calculated field directly.

You can use calculated fields to move your commonly used eval statements out of your search string and into props.conf, where they will be processed behind the scenes at search time. With calculated fields, you can change the search from:

sourcetype="cisco_esa" mailfrom=* | eval accountname=split(mailfrom,"@"), from_user=mvindex(accountname,0), from_domain=mvindex(accountname,-1) | table mailfrom, from_user, from_domain

to this search:

sourcetype="cisco_esa" mailfrom=* | table mailfrom, from_user, from_domain

In this example, the three eval expressions that were in the search, which defined the accountname, from_user, and from_domain fields, are now computed behind the scenes when the search is run for any event that contains the extracted mailfrom field. You can also search on those fields independently after they are set up as calculated fields in props.conf. You could search on from_domain=email.com, for example.
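
As a sketch only, the corresponding calculated field definitions in props.conf might look like the following. The stanza name depends on your deployment, and calculated fields are evaluated independently of one another, so each expression repeats the split() call instead of referencing an intermediate accountname field:

# Sketch only: the stanza name and expressions depend on your data
[cisco_esa]
EVAL-from_user = mvindex(split(mailfrom, "@"), 0)
EVAL-from_domain = mvindex(split(mailfrom, "@"), -1)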

For more information about calculated fields, see About calculated fields in the Knowledge Manager Manual.

Search event tokens

If you are using the eval command in search event tokens, some of the evaluation functions might be unavailable or have a different behavior. See Custom logic for search tokens in Dashboards and Visualizations for information about the evaluation functions that you can use with search event tokens.

Basic Examples

1. Create a new field that contains the result of a calculation

Create a new field called velocity in each event. Calculate the velocity by dividing the values in the distance field by the values in the time field.

... | eval velocity=distance/time

2. Use the if function to determine the values placed in the status field

Create a field called status in each event. Using the if function, set the value in the status field to OK if the error value is 200. Otherwise set the status value to Error.

... | eval status = if(error == 200, "OK", "Error")

3. Convert values to lowercase

Create a new field in each event called lowuser. Using the lower function, populate the field with the lowercase version of the values in the username field.

... | eval lowuser = lower(username)

4. Use the value of one field as the name for a new field

In this example, use each value of the field counter to make a new field name. Assign to the new field the value of the Value field. See Field names under the Usage section.

index=perfmon sourcetype=Perfmon* counter=* Value=* | eval {counter} = Value

5. Set sum_of_areas to be the sum of the areas of two circles

... | eval sum_of_areas = pi() * pow(radius_a, 2) + pi() * pow(radius_b, 2)

6. Set status to some simple http error codes

... | eval error_msg = case(error == 404, "Not found", error == 500, "Internal Server Error", error == 200, "OK")

7. Concatenate values from two fields

Use the period ( . ) character to concatenate the values in first_name field with the values in the last_name field. Quotation marks are used to insert a space character between the two names. When concatenating, the values are read as strings, regardless of the actual value.

... | eval full_name = first_name." ".last_name

8. Separate multiple eval operations with a comma

You can specify multiple eval operations by using a comma to separate the operations. In the following search the full_name evaluation uses the period ( . ) character to concatenate the values in the first_name field with the values in the last_name field. The low_name evaluation uses the lower function to convert the full_name evaluation into lowercase.

... | eval full_name = first_name." ".last_name, low_name = lower(full_name)

9. Display timechart of the avg of cpu_seconds by processor, rounded to 2 decimals

... | timechart eval(round(avg(cpu_seconds),2)) by processor

10. Convert a numeric field value to a string with commas and 2 decimals

If the original value of x is 1000000, this returns x as 1,000,000.

... | eval x=tostring(x,"commas")

To include a currency symbol at the beginning of the string:

... | eval x="$".tostring(x,"commas")

This returns x as $1,000,000.

Extended Examples

11. Coalesce a field from two different source types, create a transaction of events

This example shows how you might coalesce a field from two different source types and use that to create a transaction of events. sourcetype=A has a field called number, and sourcetype=B has the same information in a field called subscriberNumber.

sourcetype=A OR sourcetype=B | eval phone=coalesce(number,subscriberNumber) | transaction phone maxspan=2m

The eval command is used to add a common field, called phone, to each of the events, whether they are from sourcetype=A or sourcetype=B. The value of phone is defined, using the coalesce() function, as the values of number and subscriberNumber. The coalesce() function takes the value of the first field that is not NULL, that is, the first field that exists in the event.

Now, you're able to group events from either source type A or B if they share the same phone value.

12. Separate events into categories, count and display minimum and maximum values

This example uses recent earthquake data downloaded from the USGS Earthquakes website. The data is a comma separated ASCII text file that contains magnitude (mag), coordinates (latitude, longitude), region (place), and so forth, for each earthquake recorded.

You can download a current CSV file from the USGS Earthquake Feeds and upload the file to your Splunk instance if you want to follow along with this example.

Earthquakes occurring at a depth of less than 70 km are classified as shallow-focus earthquakes, while those with a focal-depth between 70 and 300 km are commonly termed mid-focus earthquakes. In subduction zones, deep-focus earthquakes may occur at much greater depths (ranging from 300 up to 700 kilometers).

To classify recent earthquakes based on their depth, you use the following search.

source=usgs | eval Description=case(depth<=70, "Shallow", depth>70 AND depth<=300, "Mid", depth>300, "Deep") | stats count min(mag) max(mag) by Description

The eval command is used to create a field called Description, which takes the value of "Shallow", "Mid", or "Deep" based on the depth of the earthquake. The case() function is used to specify which ranges of the depth fit each description. For example, if the depth is less than 70 km, the earthquake is characterized as a shallow-focus quake and the resulting Description is Shallow.

The search also pipes the results of the eval command into the stats command to count the number of earthquakes and display the minimum and maximum magnitudes for each Description. The search results appear in the Statistics tab.

The following table shows an example of the search results.

Description count min(mag) max(mag)
Deep 35 4.1 6.7
Mid 635 0.8 6.3
Shallow 6236 -0.60 7.70


13. Find IP addresses and categorize by network using eval functions cidrmatch and if

This example is designed to use the sample dataset from the Get the tutorial data into Splunk topic of the Search Tutorial, but it should work with any format of Apache Web access log. Download the data set and follow the instructions in that topic to upload it to your Splunk deployment. Then, run this search using the time range Other > Yesterday.

In this search, you're finding IP addresses and classifying the network they belong to.

sourcetype=access_* | eval network=if(cidrmatch("192.168.0.0/16", clientip), "local", "other")

This example uses the cidrmatch() function to compare the IP addresses in the clientip field to a subnet range. The search also uses the if() function, which says that if the value of clientip falls in the subnet range, then network is given the value local. Otherwise, network=other.

The eval command does not do any special formatting to your results. It simply creates a new field that takes its value from the eval expression. After you run this search, use the fields sidebar to add the network field to your results. You can then see, inline with your search results, which IP addresses are part of your local network and which are not.

Another option for formatting your results is to pipe the results of eval to the table command to display only the fields of interest to you, as shown in example 14.

Note: This example just illustrates how to use the cidrmatch function. If you want to classify your events and quickly search for those events, the better approach is to use event types. Read more about event types in the Knowledge Manager Manual.

14. Extract information from an event into a separate field, create a multivalue field

This example uses generated email data (sourcetype=cisco_esa). You should be able to run this example on any email data by replacing the sourcetype=cisco_esa with your data's sourcetype value and the mailfrom field with your data's email address field name (for example, it might be To, From, or Cc).

Use the email address field to extract the user's name and domain. The eval command in this search contains multiple expressions, separated by commas.

sourcetype="cisco_esa" mailfrom=* | eval accountname=split(mailfrom,"@"), from_user=mvindex(accountname,0), from_domain=mvindex(accountname,-1) | table mailfrom, from_user, from_domain

This example uses the split() function to break the mailfrom field into a multivalue field called accountname. The first value of accountname is everything before the "@" symbol, and the second value is everything after.

The example then uses the mvindex() function to set from_user and from_domain to the first and second values of accountname, respectively.

The results of the eval expressions are then piped into the table command, which displays the original mailfrom values alongside the new from_user and from_domain values.

Note: This example is not especially practical. It was written to demonstrate how to use an eval function to identify the individual values of a multivalue field. Because this particular set of email data did not have any multivalue fields, the example creates one (accountname) from a single-value field (mailfrom).

15. Categorize events using the match function

This example uses generated email data (sourcetype=cisco_esa). You should be able to run this example on any email data by replacing the sourcetype=cisco_esa with your data's sourcetype value and the mailfrom field with your data's email address field name (for example, it might be To, From, or Cc).

This example classifies where an email came from based on the email address's domain: .com, .net, and .org addresses are considered local, while anything else is considered abroad. (Of course, domains that are not .com/.net/.org are not necessarily from abroad.)

The eval command in this search contains multiple expressions, separated by commas.

sourcetype="cisco_esa" mailfrom=*| eval accountname=split(mailfrom,"@"), from_domain=mvindex(accountname,-1), location=if(match(from_domain, "[^\n\r\s]+\.(com|net|org)"), "local", "abroad") | stats count by location

The first half of this search is similar to example 14. The split() function is used to break up the email address in the mailfrom field. The mvindex() function defines from_domain as the portion of the mailfrom field after the @ symbol.

Then, the if() and match() functions are used: if the from_domain value ends with .com, .net, or .org, the location field is assigned the value local. If from_domain does not match, location is assigned the value abroad.

The eval results are then piped into the stats command to count the number of results for each location value.

After you run the search, you can add the mailfrom and location fields to your events to see the classification inline with your events.

Note: This example merely illustrates using the match() function. If you want to classify your events and quickly search for those events, the better approach is to use event types. Read more about event types in the Knowledge Manager Manual.

16. Convert the duration of transactions into more readable string formats

This example uses the sample dataset from the Search Tutorial but should work with any format of Apache Web access log. Download the data set from that topic in the Search Tutorial and follow the instructions to upload it to your Splunk deployment. Then, run this search using the time range Other > Yesterday.

Reformat a numeric field measuring time in seconds into a more readable string format.

sourcetype=access_* | transaction clientip maxspan=10m | eval durationstr=tostring(duration,"duration")

This example uses the tostring() function and the duration option to convert the duration of the transaction into a more readable string formatted as HH:MM:SS. The duration is the time between the first and last events in the transaction and is given in seconds.

The search defines a new field, durationstr, for the reformatted duration value. After you run the search, you can use the fields sidebar to show the two fields inline with your events.

Answers

Have questions? Visit Splunk Answers and see what questions and answers the Splunk community has using the eval command.
