Splunk® Enterprise Security

Splunk Enterprise Security Tutorials

The documentation for Splunk Enterprise Security versions 8.0 and higher has been rearchitected from previous versions, causing some links to have redirect errors. For documentation on version 8.0, see the Splunk Enterprise Security documentation homepage.
This documentation does not apply to the most recent version of Splunk® Enterprise Security. For documentation on the most recent version, go to the latest release.

Part 3: Create the correlation search in guided mode

After you define the title, app context, and description of the search, it is time to build it. The best way to build a correlation search with syntax that parses and works as expected is to use guided search creation mode.

Open the guided search creation wizard

  1. From the correlation search editor, click Guided for the Mode setting.
  2. Click Continue to open the guided search editor.

Select the data source for the search

Start your correlation search by choosing a data source.

  1. For the Data source field, select the source for your data.
    • Select Data Model if your data is stored in a data model. The data model defines which objects, or datasets, the correlation search can use as a data source.
    • Select Lookup File if your data is stored in a lookup. If you select Lookup File as the data source, you must also select a specific lookup file by name.
    To recreate the Excessive Failed Logins search, select Data Model.
  2. In the Data model list, select the data model that contains the security-relevant data for your search. Select the Authentication data model because it contains login-relevant data.
  3. In the Dataset list, select the Failed_Authentication dataset. The Excessive Failed Logins search is looking for failed logins, and that information is stored in this data model dataset.
  4. For the Summaries only field, click Yes to restrict the search to accelerated data only.
  5. Select a Time range of events for the correlation search to scan for excessive failed logins. Select a preset relative time range of Last 60 minutes. The time range depends on the security use case for the search. Excessive failed logins are more of a security issue if they occur during a one-hour time span, whereas one hour might not be long enough to catch other security incidents.
  6. Click Preview to review the first portion of the search. A sketch of what this portion might look like follows this procedure.
    This screen image shows the guided correlation search editor with a preview of the Authentication data model and Failed Authentication dataset search.
  7. Click Next to continue building the search.
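When Summaries only is set to Yes, guided mode builds the search string on the tstats command and scopes it to the dataset that you selected. As a rough sketch only (the exact string that the wizard generates can differ, and count appears here as a placeholder aggregate that you replace with your own aggregates in a later step), the first portion of the search resembles the following:

    | tstats summariesonly=true count from datamodel=Authentication.Authentication where nodename=Authentication.Failed_Authentication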

Filter the data with a where clause

Use a where clause to filter the data that the correlation search examines for a match. The search applies the filter before applying the statistical aggregates.

By default, the Excessive Failed Logins search does not include a where clause filter, but you can add one if you want to focus on failed logins for specific hosts, users, or authentication types.

The search preview shows you whether the correlation search string can be parsed. The preview appends the filter to the search string as you type it, letting you see whether the filter is a valid where clause. You can run the search to see if it returns the results that you expect. If the where clause filters on a data model field such as Authentication.dest, enclose the field name in single quotes. For example, a where clause that excludes authentication events where the destination is the local host looks as follows: | where 'Authentication.dest'!="127.0.0.1".
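For instance, a hypothetical filter that excludes both the local host destination and a service account named svc_backup (an illustrative account name, not part of this tutorial) combines conditions like this:

    | where 'Authentication.dest'!="127.0.0.1" AND 'Authentication.user'!="svc_backup"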

  1. Leave the Filter field blank and click Next.
    This screen image shows the guided search editor where clause filter page.

Analyze your data with statistical aggregates

Analyze your data with statistical aggregates. Each aggregate is a function that applies to a specific attribute in a data model or field in a lookup file. Use the aggregates to identify the statistics that are relevant to your use case.

For example, the Excessive Failed Logins correlation search uses four statistical aggregate functions to surface the important data points needed to define alerting thresholds. For this search, the aggregates identify the following:

  • Tags associated with the authentication attempts.
  • Number of users involved.
  • Number of destinations involved.
  • Total count of attempts.

To replicate this search, create the aggregates.

Create the tags aggregate

Identify the successes and failures in authentication attempts with tags.

  1. Click Add a new aggregate.
  2. Select the values function from the Function list.
  3. Select Authentication.tag from the Field list.
  4. Type tag in the Alias field.

This aggregate retrieves all the values of the Authentication.tag field.

Create the user count aggregate

Identify the number of distinct users involved.

  1. Click Add a new aggregate.
  2. Select the dc function from the Function list.
  3. Select Authentication.user from the Field list.
  4. Type user_count in the Alias field.

This aggregate retrieves a distinct count of users.

Create the destination count aggregate

Identify the number of distinct destinations involved.

  1. Click Add a new aggregate.
  2. Select the dc function from the Function list.
  3. Select Authentication.dest from the Field list.
  4. Type dest_count in the Alias field.

This aggregate retrieves a distinct count of devices that are the destination of authentication activities.

Create a total count aggregate

Identify the overall count.

  1. Click Add a new aggregate.
  2. Select the count function from the Function list.
  3. Leave the Field and Alias settings empty.

This aggregate identifies the total count for statistical analysis.
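Taken together, these four aggregates correspond to the following statistical functions and aliases in the generated search string. Where the wizard places them in the string is handled for you, so treat this as a sketch:

    values(Authentication.tag) as tag, dc(Authentication.user) as user_count, dc(Authentication.dest) as dest_count, count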

Fields to split by

Identify the fields that you want to split, or group, the aggregate results by. For example, you care more about excessive failed logins if the users were logging in to the same application from the same source. To get more specific notable events and to avoid over-alerting, define split-by fields for the aggregate search results.

Split the aggregates by application.

  1. Click Add a new split-by.
  2. From the Fields list, select Authentication.app.
  3. Type app in the Alias field.

Split the aggregates by source.

  1. Click Add a new split-by.
  2. From the Fields list, select Authentication.src.
  3. Type src in the Alias field.

This screen image shows the aggregates and split-by fields completed.
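With the aggregates and split-by fields defined, the statistical portion of the generated search string might resemble the following sketch. The wizard typically applies the app and src aliases for you, for example with a rename command or the drop_dm_object_name macro:

    | tstats summariesonly=true values(Authentication.tag) as tag, dc(Authentication.user) as user_count, dc(Authentication.dest) as dest_count, count from datamodel=Authentication.Authentication where nodename=Authentication.Failed_Authentication by Authentication.app, Authentication.src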

Click Next to define the correlation search match criteria.

You can find information on split-by fields in the Splunk platform documentation.

  • For Splunk Enterprise, see Optional arguments in the Splunk Enterprise Search Reference.
  • For Splunk Cloud Platform, see Optional arguments in the Splunk Cloud Platform Search Reference.

Define the correlation search match criteria for analysis

Identify the criteria that define a match for the correlation search. The correlation search performs an action when the search results match predefined conditions. Define the statistical function to use to look for a match.

For Excessive Failed Logins, when a specific user has six or more failed logins from the same source while attempting to log in to the same application, the correlation search identifies a match and takes action. A sketch of the complete search string follows these steps.

  1. In the Field list, select count. The Field list is populated with the attributes used in the aggregates and the fields used in the split-by.
  2. In the Comparator list, select Greater than or equal to.
  3. In the Value field, type 6.
  4. Click Next.
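After the match criteria are added, the complete search string looks approximately like the following. This is a sketch of the Excessive Failed Logins search; the string that the wizard produces can differ slightly, for example in how it renames the split-by fields:

    | tstats summariesonly=true values(Authentication.tag) as tag, dc(Authentication.user) as user_count, dc(Authentication.dest) as dest_count, count from datamodel=Authentication.Authentication where nodename=Authentication.Failed_Authentication by Authentication.app, Authentication.src | `drop_dm_object_name("Authentication")` | where count>=6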

Test the correlation search string

The guided mode wizard ensures that your search string parses. Run the search to confirm that it produces events and returns the preliminary results that you expect.

The correlation search results must include at least one event to generate a notable event.

  1. Open a new tab in your browser and navigate to the Splunk platform Search page.
  2. Run the correlation search to validate that it produces events that match your expectations.
    1. If your search does not parse, but it parsed successfully at the filtering step, return to the aggregates and split-by fields in the guided search editor to identify errors.
    2. If your search parses but does not produce events that match your expectations, adjust the elements of your search as needed.
  3. After you validate your search string on the search page, return to the guided search editor and click Done to return to the correlation search editor.

Next Step

Part 4: Schedule the correlation search.

Last modified on 12 July, 2022

This documentation applies to the following versions of Splunk® Enterprise Security: 7.0.1, 7.0.2, 7.1.0, 7.1.1, 7.1.2, 7.2.0, 7.3.0, 7.3.1, 7.3.2

