Splunk® Enterprise

REST API Tutorials

Creating searches using the REST API

Use the search/jobs endpoint to create a search job in a Splunk deployment. Before creating searches, be aware of how searches work and how to structure a search so you can easily access the results.

Learn about searches

You can learn about searches and how to write them in the Search Manual. The Search Reference can also help you get started with search commands and syntax.

REST endpoints for searches

Here is a brief description of some of the key endpoints for creating and accessing searches.

/search/jobs
Create searches or access the results of search jobs. Returns a search ID (sid) that you use when accessing the results of a search.

/search/jobs/export
Stream search results as they become available. Does not create a search ID for later access.

/search/jobs/{search_id}/control
Execute a job control command for a search, such as pause, setpriority, or finalize.

/search/jobs/{search_id}/events
Return untransformed events of a search.

/search/jobs/{search_id}/results
Return transformed events of a search.

/search/jobs/{search_id}/summary
Return summary information for fields of a search.

/search/jobs/{search_id}/timeline
Return event distribution over time of the so-far-read untransformed events.

/saved/searches
Create or access the configuration of saved searches.
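
For example, to pause a running job you can POST an action to its control endpoint. The following curl sketch assumes a hypothetical search ID of 1409012345.12 and the default admin credentials used elsewhere in this topic:

curl -u admin:changeme -k \
     https://localhost:8089/services/search/jobs/1409012345.12/control \
     -d action=pause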

Tips on creating searches

When running searches using the REST API, it is a best practice to add time modifiers to limit the search time range. By default, REST API searches run over all time, which can be inefficient. For more information, see Time modifiers in the Search Reference.

When creating a search (POST /search/jobs), consider the following properties of the search (a combined curl example follows this list):

max_count
Set this parameter for searches that return more than the default maximum of 10,000 events. Otherwise, you might not be able to retrieve results beyond the default maximum.

status_buckets
To access summary and timeline information from a search job, specify a value for status_buckets. The default value is zero. For example, searches spawned from the Splunk timeline specify status_buckets=300.

rf
Use the rf parameter to add required fields to a search. Adding fields guarantees results for the endpoints that return events and a summary. (The required_fields parameter has been deprecated in favor of the rf parameter.)

earliest_time and latest_time
Use these parameters to add time modifiers that specify an appropriate time range for the search. By default, REST API searches run over all time.
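
Putting these properties together, here is a hedged curl sketch that creates a search job over the last 24 hours, reserves 300 status buckets for summary and timeline access, requires the host field, and raises the event cap. The search string and credentials are placeholders:

curl -u admin:changeme -k https://localhost:8089/services/search/jobs \
     -d search="search index=_internal | head 10000" \
     -d earliest_time="-24h" \
     -d latest_time="now" \
     -d status_buckets=300 \
     -d rf=host \
     -d max_count=50000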

Tips on accessing searches

When accessing the results of a search (GET /search/jobs/{search_id}), consider the following (a curl example follows this list):

search, offset, and count parameters
Use these parameters with a GET operation to filter or limit the results that are returned.

dispatchState
dispatchState is one of the properties returned when accessing a search. It provides the state of a search, which can be any of the following:

QUEUED
PARSING
RUNNING
FINALIZING
DONE
PAUSE
INTERNAL_CANCEL
USER_CANCEL
BAD_INPUT_CANCEL
QUIT
FAILED

search job properties
The GET operation for /search/jobs returns all the properties of a search. These properties are described in the parameters to the POST operation. Search job properties are also described in View search job properties in the Search Manual.

performance
The GET operation for /search/jobs returns information that helps you troubleshoot the efficiency of a search. See the "Execution costs" section in View search job properties in the Search Manual.
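
For example, the following hedged curl sketch pages through the results of a finished job, returning results 100 through 199 as JSON and applying a post-process search that keeps only results containing the term error. The search ID 1409012345.12 is a placeholder:

curl -u admin:changeme -k \
     https://localhost:8089/services/search/jobs/1409012345.12/results \
     --get -d output_mode=json -d offset=100 -d count=100 \
     --data-urlencode search="search error"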

Example: Create a search

Many calls to the Splunk REST API involve running some kind of search. For example, you might want to run a search in Splunk Enterprise and POST the results to a third-party application. Use the endpoints located at the ../services/search/<endpoint> URIs.

When you run a search, the search process launches asynchronously. You can poll the jobs or events endpoint to see if your search has finished.

Create a search job

Create a search job using the POST operation at search/jobs/. Set your search as the POST payload. For example:

curl -u admin:changeme -k https://localhost:8089/services/search/jobs -d search="search *"

This simple example runs the search for *. It returns an XML response such as:

<?xml version='1.0' encoding='UTF-8'?>
<response>
  <sid>1258421375.19</sid>
</response>

The search ID, which you need to retrieve the search results, is returned within the <sid> tags. In the example above, the search ID is 1258421375.19.

Check status of a search

Check the status of a search job by accessing the GET operation of search/jobs/. If you know the search's ID, you can access search/jobs/{search_id} to get information about that search only:

curl -u admin:changeme -k https://localhost:8089/services/search/jobs/1258421375.19 

If you're not sure what searches you're running, the GET operation at the search/jobs endpoint returns a list of searches with their search IDs.

curl -u admin:changeme -k https://localhost:8089/services/search/jobs/ 
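
Either response is an Atom feed. The entry for a job includes properties such as dispatchState and isDone that tell you whether the search has finished. To work with the properties as JSON instead of XML, you can pass output_mode=json, as in this sketch for the job created above:

curl -u admin:changeme -k \
     https://localhost:8089/services/search/jobs/1258421375.19 \
     --get -d output_mode=json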

Get search results

Use the results endpoint located at /search/jobs/{search_id}/results/ to retrieve your search results. This endpoint returns results only when your search has completed. You can also get output from the events endpoint located at /search/jobs/{search_id}/events/ while your search is still running. For complete search results, use the results endpoint.

You can return search results in JSON, CSV, or XML by setting the output_mode parameter. By default, results are returned in XML format.

For example, to retrieve search results in CSV format, make the following call.

Note: The curl listing includes --get because you are passing a parameter to a GET operation.

curl -u admin:changeme \
     -k https://localhost:8089/services/search/jobs/1258421375.19/results/ \
     --get -d output_mode=csv

Note: This is one method that you can use to export large numbers of search results. For more information about exporting search results, as well as information about the other export methods, see "Export search results" in the Search Manual.

Python example

This example script authenticates against a Splunk server and runs a search query in Python. After running the search, the script returns the search ID (sid). This script has been made cross-compatible with Python 2 and Python 3 using python-future.

from __future__ import print_function
from future import standard_library
standard_library.install_aliases()
import urllib.request, urllib.parse, urllib.error
import httplib2
from xml.dom import minidom

baseurl = 'https://localhost:8089'
userName = 'admin'
password = 'changeme'

searchQuery = '| inputcsv foo.csv | where sourcetype=access_common | head 5'

# Authenticate with server.
# Disable SSL cert validation. Splunk certs are self-signed.
serverContent = httplib2.Http(disable_ssl_certificate_validation=True).request(baseurl + '/services/auth/login',
    'POST', headers={}, body=urllib.parse.urlencode({'username':userName, 'password':password}))[1]

sessionKey = minidom.parseString(serverContent).getElementsByTagName('sessionKey')[0].childNodes[0].nodeValue

# Remove leading and trailing whitespace from the search
searchQuery = searchQuery.strip()

# If the query doesn't already start with the 'search' operator or another
# generating command (e.g. "| inputcsv"), then prepend "search " to it.
if not (searchQuery.startswith('search') or searchQuery.startswith("|")):
    searchQuery = 'search ' + searchQuery
    
print(searchQuery)

# Run the search.
# Again, disable SSL cert validation.
print(httplib2.Http(disable_ssl_certificate_validation=True).request(baseurl + '/services/search/jobs','POST',
    headers={'Authorization': 'Splunk %s' % sessionKey},body=urllib.parse.urlencode({'search': searchQuery}))[1])
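
The script above only starts the search job and prints the raw server response. As a follow-up, here is a hedged, minimal sketch of how you might replace that final print call: it captures the response, parses out the sid, polls the job until the isDone property reports completion, and then retrieves the results as JSON. It reuses baseurl, sessionKey, and searchQuery from the script above; the specific polling interval and count=0 setting are illustrative choices, not requirements.

# Additional imports needed for this sketch
import json
import time

# Start the search job and capture the response instead of printing it.
response = httplib2.Http(disable_ssl_certificate_validation=True).request(
    baseurl + '/services/search/jobs', 'POST',
    headers={'Authorization': 'Splunk %s' % sessionKey},
    body=urllib.parse.urlencode({'search': searchQuery}))[1]

# The response is XML containing the search ID inside <sid> tags.
sid = minidom.parseString(response).getElementsByTagName('sid')[0].childNodes[0].nodeValue
print('Created search job %s' % sid)

# Poll the job until the isDone property reports that it has finished.
while True:
    jobStatus = httplib2.Http(disable_ssl_certificate_validation=True).request(
        baseurl + '/services/search/jobs/%s?output_mode=json' % sid, 'GET',
        headers={'Authorization': 'Splunk %s' % sessionKey})[1]
    if json.loads(jobStatus)['entry'][0]['content']['isDone']:
        break
    time.sleep(2)

# Retrieve the results as JSON. count=0 asks for all available results.
results = httplib2.Http(disable_ssl_certificate_validation=True).request(
    baseurl + '/services/search/jobs/%s/results?output_mode=json&count=0' % sid, 'GET',
    headers={'Authorization': 'Splunk %s' % sessionKey})[1]
print(results)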