Splunk Storm User Manual


Send data with Storm's REST API

Use Storm's REST API

NOTE: Splunk has announced the end-of-life of Splunk Storm. See the end-of-life announcement for details.

This topic provides an overview of how to make REST API calls to add data to a Storm project. Use the /1/inputs/http endpoint to add raw data to a Storm project with the ability to change the source and host default fields.

REST API call overview

When accessing this endpoint, specify the following:

  • Authentication
  • API address. This is unique to a project. Find it in Project > Data > API. Its format is api-****.data.splunkstorm.com.
  • Splunk default fields (indexed fields that Splunk Storm automatically recognizes in your event data at search time)
  • Request body (raw event text to be indexed)

Here is an example call using cURL, assuming that $API_ADDRESS is your project's API address (api-****.data.splunkstorm.com):

curl -u x:<Token> \
  "https://$API_ADDRESS/1/inputs/http?index=<ProjectID>&sourcetype=<type>"  \
  -H "Content-type: text/plain" \
  -d "<Request body>" 


Storm uses an API token authentication scheme over HTTP Basic Authentication with TLS data encryption (HTTPS). For each user, Storm creates an access token that you use as the "password" for access. Any value for username can be used with this token. Storm ignores the username field.

You can use the same token for all Storm projects for which you are an administrator.
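Because Storm ignores the username half of the credential pair, the Basic auth header that cURL's -u flag produces can be sketched directly. This is only an illustration of what goes over the wire; the token value below is a made-up placeholder:

```python
import base64

# Hypothetical token value; Storm ignores the username half of the pair.
token = "EXAMPLE-ACCESS-TOKEN"
username = "x"  # any value works; only the token matters

# HTTP Basic Authentication: base64-encode "username:password".
credentials = f"{username}:{token}".encode("ascii")
auth_header = "Basic " + base64.b64encode(credentials).decode("ascii")

# This is the header cURL's -u flag generates for you:
#   Authorization: Basic eDpFWEFNUExFLUFDQ0VTUy1UT0tFTg==
print(auth_header)
```

In practice you never build this header by hand; `curl -u x:<Token>` does it for you, and TLS (HTTPS) keeps the token from traveling in the clear.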

Splunk default fields

These fields are passed to the endpoint as query parameters in the request URL.


The following parameters are required. Storm responds with an HTTP 400 error if either of the required fields is missing.

  • index: Specifies the project ID.
  • sourcetype: Identifies the incoming data. See About source types for information on using source types with Splunk Storm.


You can optionally specify the following parameters. The data input API allows you to override the values normally associated with host and source if you need to use these for data classification.

  • tz: Time zone to use when indexing events.
  • host: The host emitting the data (defaults to the reverse DNS for the source IP).
  • source: The data source (defaults to the source IP of the data).
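Because these fields travel in the query string, values such as a filesystem path in source must be percent-encoded. Here is a minimal sketch of assembling the endpoint URL in Python; the address, project ID, and field values are all placeholders:

```python
from urllib.parse import urlencode

# All values below are made-up placeholders.
api_address = "api-xxxx.data.splunkstorm.com"
params = {
    "index": "0123456789ab",        # required: project ID
    "sourcetype": "syslog",         # required: identifies the incoming data
    "host": "web01.example.com",    # optional: override the host field
    "source": "/var/log/messages",  # optional: override the source field
    "tz": "America/Los_Angeles",    # optional: time zone for indexing
}

# urlencode percent-encodes each value, e.g. "/" becomes "%2F".
url = f"https://{api_address}/1/inputs/http?{urlencode(params)}"
print(url)
```

The resulting URL is what you would pass to cURL in the examples that follow.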

Request body

The raw event text to input. You can send the raw event text as plain text (text/plain) or as URL-encoded text (application/x-www-form-urlencoded). In either case, Storm correctly interprets the raw input event text.
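As a sketch of the two encodings, the same event text can be sent either raw or percent-encoded; either way, Storm recovers the original raw text. The event below is illustrative:

```python
from urllib.parse import quote_plus, unquote_plus

# Illustrative event text.
event = "Sun Apr 11 15:35:15 UTC 2011 action=download_packages status=OK"

# With Content-Type: text/plain, the body is the raw event bytes:
plain_body = event.encode("utf-8")

# With Content-Type: application/x-www-form-urlencoded, the same event
# is percent-encoded (spaces become "+", "=" becomes "%3D"):
encoded_body = quote_plus(event)

# Decoding the form-encoded body yields the original event text again,
# which is what Storm does on receipt.
assert unquote_plus(encoded_body) == event
```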

Build REST API calls to input data

There are several ways to input data into a Storm project using the REST API. The method you choose depends on the type and amount of data you want to send, the amount of physical memory you have on the server sending data, and the destination in your Storm projects for the data.

Basic REST API call

The most basic call sends a single event over a single connection in a single HTTP request. For example, consider a source type that handles log data. Open a connection to the project. For each logging event, send the single logging event as the complete body of the request. You can keep the connection open to send additional logging events.

This trivial use case is typically inefficient. You often send and receive more data in the headers of the request than in the body.

Send multiple events over a single call

In this use case, you buffer multiple events that you send in a single call. Again, consider a source type that handles log data. You may want to buffer many events locally until you reach a threshold, such as the size of the data, a time period to send data, a specific log level (for example, ERROR logs), or some other factor.

If you have collections of events that need different Splunk default fields, such as source, send them in different HTTP requests. You specify these fields in the URL query string, which necessitates the separate requests.

Once you reach the threshold, send the buffered events as the body of a single call. You can also pipeline requests. Keep the TCP connection open, and send multiple HTTP requests.

The advantage of this use case is that you limit the SSL/TLS and HTTP overhead in sending and receiving data. However, you must weigh factors such as:

  • Local memory available for buffering data
  • Risk of losing data should a server go down
  • Timeliness of making the new events available in your project
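The buffering pattern described above can be sketched as follows. Events are newline-delimited and flushed as a single request body once a size threshold is crossed; the threshold and the events themselves are illustrative assumptions, and the collected bodies stand in for the actual HTTP POSTs:

```python
# Sketch of client-side buffering: collect newline-delimited events and
# flush them as one request body once a size threshold is crossed.
# The 4 KB threshold and the event text are illustrative.
FLUSH_THRESHOLD = 4096  # bytes

buffer = []
buffered_bytes = 0
flushed_bodies = []  # each entry would be POSTed in a single call


def flush():
    """Join buffered events into one body, ready to send in a single call."""
    global buffered_bytes
    if buffer:
        flushed_bodies.append("\n".join(buffer))
        buffer.clear()
        buffered_bytes = 0


def log_event(event):
    """Buffer one event; flush when the accumulated size crosses the threshold."""
    global buffered_bytes
    buffer.append(event)
    buffered_bytes += len(event.encode("utf-8")) + 1  # +1 for the newline
    if buffered_bytes >= FLUSH_THRESHOLD:
        flush()


for i in range(300):
    log_event(f"2011-04-11T15:35:{i % 60:02d}Z action=poll status=OK seq={i}")
flush()  # send whatever remains
```

A real client would also flush on a timer or at shutdown, so events are not stranded in memory when traffic is light.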

Send compressed data

You may send data more efficiently by uploading a gzip-compressed stream. If you send gzipped data, you must supply a "Content-Encoding: gzip" header so that it is correctly decompressed on receipt.


echo 'Sun Apr 11 15:35:15 UTC 2011 action=download_packages status=OK pkg_dl=751 elapsed=37.543' \
    | gzip \
    | curl -u x:$ACCESS_TOKEN \
    "https://$API_ADDRESS/1/inputs/http?index=$PROJECT_ID&sourcetype=generic_single_line" \
    -H "Content-type: text/plain" \
    -H "Content-Encoding: gzip" \
    -v --data-binary @-
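To show the round trip, the same compression can be sketched in Python; gzip.compress stands in for the gzip stage of the pipeline above:

```python
import gzip

# The event text from the cURL example above.
event = ("Sun Apr 11 15:35:15 UTC 2011 action=download_packages "
         "status=OK pkg_dl=751 elapsed=37.543")

# Compress the request body, as the `gzip` stage does in the pipeline.
body = gzip.compress(event.encode("utf-8"))

# With the "Content-Encoding: gzip" header set, the receiving end
# decompresses the body back to the original raw event text.
assert gzip.decompress(body).decode("utf-8") == event
```

For a single short event like this, the gzip header and trailer can outweigh the savings; compression pays off when you send larger, buffered request bodies.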

Example: Input data using cURL

Before sending data to a Storm project, obtain an access token and the Storm project ID from your Storm account. Then build a query to Splunk Storm to input data.

When making a REST API call, the access token is paired with a username. However, Storm ignores the username, using only the access token for authentication.

1. Log in to your Storm project. Navigate to <Project_Name> > Inputs > API.

2. Copy the access token and project ID into your environment, along with a source type and the event you'd like to send. From a *nix terminal window, export these values as environment variables. The following example specifies generic_single_line for a source type. The full list of source types is available from the About source types documentation.

export ACCESS_TOKEN=<access_token>
export PROJECT_ID=<project_id>
export SOURCETYPE=generic_single_line
export EVENT="Sun Apr 11 15:35:15 UTC 2011 action=download_packages status=OK pkg_dl=751 elapsed=37.543"

3. Use curl to run the HTTP request using the above exported environment variables:

curl -u x:$ACCESS_TOKEN \
  "https://api-****.data.splunkstorm.com/1/inputs/http?index=$PROJECT_ID&sourcetype=$SOURCETYPE" \
  -H "Content-Type: text/plain" \
  -d "$EVENT"


Sending a file

You can also use cURL to send an entire file instead of a single event. However, to monitor one or more files and send updates to Storm as they happen, use a Splunk forwarder instead for reliable transmission.

To send a single one-off file:

1. As with sending a single event in the previous example, export your access token and project ID into your *nix terminal, along with the source type and the name of the file you want to send:

export ACCESS_TOKEN=<access_token>
export PROJECT_ID=<project_id>
export SOURCETYPE=access_combined
export FILENAME=/var/log/apache2/access.log

2. Use curl to run the HTTP request using the above exported environment variables:

curl -u x:$ACCESS_TOKEN \
  "https://api-****.data.splunkstorm.com/1/inputs/http?index=$PROJECT_ID&sourcetype=$SOURCETYPE" \
  -H "Content-Type: text/plain" \
  --data-binary @$FILENAME

Depending on the size of the file, curl may take several minutes to transfer the data to Storm.

Note: Although you can transfer large files this way, for files larger than a few hundred megabytes we recommend using a Splunk forwarder instead. A forwarder automatically resumes if a transmission error occurs midway through the upload. For more details, see the documentation about forwarding data to Storm.

More examples

For examples using Python and Ruby, read "Examples: input data with Python or Ruby."

We'll be adding more (and longer) code examples in our GitHub repository.

This documentation applies to Storm. View the Article History for its revisions.


Here's a NLog (C#/.NET) implementation of the SplunkStorm REST api: https://github.com/HakanL/SplunkNLogTarget

October 3, 2013

And I notice there's no way to edit a comment. I forgot the link: https://github.com/HakanL/splunktracelistener

October 2, 2013

Here's a first cut at a TraceListener (C# System.Diagnostics) implementation for SplunkStorm. It may help some to access the REST API using .NET.

October 2, 2013

Hi Quirk, I don't have an ETA on a search endpoint, but I don't think real work can start on it at least until alerting is good to go.

Jlaw splunk, Splunker
March 27, 2013

Roughly, how far away in time is the search API?

March 27, 2013

Beatscope: http://en.wikipedia.org/wiki/Huzzah

Rachel, Splunker
March 7, 2013

@Rachel: "Huzzah!?!"

February 12, 2013

the API is back! huzzah!

Rachel, Splunker
December 12, 2012

When is the REST API available for posting data? Signed up for a new account today and we are interested ONLY in the REST api to post specific json log statements.

December 1, 2012

Hi, BWRic and Scytacki - The new data input endpoint is currently being tested and hopefully can go live very soon! The search endpoint (Scytacki) is further away but still a priority. Thanks for the questions!

Jlaw splunk, Splunker
November 30, 2012

I'm interested in knowing the API will be back

November 27, 2012

Any idea when this will be enabled again? I need an api for retrieving search results.

October 30, 2012

Thanks for the question, Royatloudlee! Here's an explanation on Splunk Answers: http://splunk-base.splunk.com/answers/43597/how-to-send-multiline-events-to-storm-using-the-api

Jlaw splunk, Splunker
March 22, 2012

How do I send multiple events? What separates the events in the request body? If it's a newline, how do I send multiple multiline events?

March 8, 2012
