
table
Description
The table command returns a table that is formed by only the fields that you specify in the arguments. Columns are displayed in the same order that fields are specified. Column headers are the field names. Rows are the field values. Each row represents an event.
The table command is similar to the fields command in that it lets you specify the fields you want to keep in your results. Use the table command when you want to retain data in tabular format.
With the exception of a scatter plot to show trends in the relationships between discrete values of your data, you should not use the table command for charts. See Usage.
Syntax
table <wc-field-list>
Arguments
- <wc-field-list>
- Syntax: <wc-field> ...
- Description: A list of valid field names. The list can be space-delimited or comma-delimited. You can use the asterisk ( * ) as a wildcard to specify a list of fields with similar names. For example, if you want to specify all fields that start with "value", you can use a wildcard such as value*.
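For example, either of the following equivalent searches keeps the host and action fields plus every field whose name starts with "value". The field names here are illustrative, not fields from a specific dataset:
... | table host action value*
... | table host, action, value*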
Usage
The table command is a transforming command. See Command types.
Visualizations
To generate visualizations, the search results must contain numeric, datetime, or aggregated data such as count, sum, or average.
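As a minimal sketch, a search along the following lines first aggregates events with the stats command, so the resulting count column is numeric and suitable for a visualization. The sourcetype and field names are illustrative:
sourcetype=access_* | stats count by host | table host, count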
Command type
The table command is a non-streaming command. If you are looking for a streaming command similar to the table command, use the fields command.
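For example, if you only need to discard the other fields and do not need tabular formatting, a sketch like the following uses the streaming fields command instead. The field names are illustrative:
... | fields host, status
Note that fields keeps internal fields such as _time by default, while table displays only the columns you name.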
Field renaming
The table command doesn't let you rename fields, only specify the fields that you want to show in your tabulated results. If you're going to rename a field, do it before piping the results to table.
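For example, this sketch renames a field first and then tabulates it under the new name. The names ip and clientip are illustrative:
... | rename ip AS clientip | table clientip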
Truncated results
The table command truncates the number of results returned based on settings in the limits.conf file. In the [search] stanza, if the value for the truncate_report parameter is 1, the number of results returned is truncated. The number of results is controlled by the max_count parameter in the [search] stanza. If truncate_report is set to 0, the max_count parameter is not applied.
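For reference, the relevant stanza in limits.conf looks something like the following. The values shown are illustrative, not defaults:
[search]
truncate_report = 1
max_count = 50000
With these illustrative settings, the table command would return at most 50000 results.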
Examples
Example 1
This example uses recent earthquake data downloaded from the USGS Earthquakes website. The data is a comma-separated ASCII text file that contains magnitude (mag), coordinates (latitude, longitude), region (place), and so on, for each earthquake recorded. You can download a current CSV file from the USGS Earthquake Feeds and add it as an input to your Splunk deployment.
Search for recent earthquakes in and around California and display only the time of the quake (time), where it occurred (place), and the quake's magnitude (mag) and depth (depth).
index=usgs_* source=usgs place=*California | table time, place, mag, depth
This simply reformats your events into a table and displays only the fields that you specified as arguments.
Example 2
This example uses the same recent earthquake data from the USGS Earthquakes website as Example 1.
Show the date, time, coordinates, and magnitude of each recent earthquake in and around California.
index=usgs_* source=usgs place=*California | rename lat as latitude lon as longitude | table time, place, lat*, lon*, mag
This example begins with a search for all recent earthquakes in and around California (place=*California).
Then it pipes these events into the rename command to change the names of the coordinate fields from lat and lon to latitude and longitude. (The table command doesn't let you rename or reformat fields, only specify the fields that you want to show in your tabulated results.)
Finally, it pipes the results into the table command and specifies both coordinate fields with lat* and lon*, the magnitude with mag, and the date and time with time.
This example just illustrates how the table command syntax allows you to specify multiple fields using the asterisk wildcard.
Example 3
This example uses the sample dataset from the tutorial but should work with any format of Apache Web access log. Download the data set from the Add data tutorial and follow the instructions to get the sample data into your Splunk deployment. Then, run this search using the time range, All time.
Search for IP addresses and classify the network they belong to.
sourcetype=access_* | dedup clientip | eval network=if(cidrmatch("192.0.0.0/16", clientip), "local", "other") | table clientip, network
This example searches for Web access data and uses the dedup command to remove duplicate values of the IP addresses (clientip) that access the server. These results are piped into the eval command, which uses the cidrmatch() function to compare the IP addresses to a subnet range (192.0.0.0/16). This search also uses the if() function, which specifies that if the value of clientip falls in the subnet range, then network is given the value local. Otherwise, network=other.
The results are then piped into the table command to show only the distinct IP addresses (clientip) and the network classification (network).
More examples
Example 1: Create a table with the fields host, action, and all fields that start with 'value'.
... | table host action value*
See Also
fields
This documentation applies to the following versions of Splunk Cloud Platform™: 8.2.2106, 8.2.2107, 8.2.2111, 8.2.2112, 8.2.2109, 8.2.2201, 8.2.2202, 8.2.2203, 9.0.2205, 9.0.2208, 9.0.2209 (latest FedRAMP release)