Using the Machine Learning Toolkit
The Machine Learning Toolkit provides the following features:
- Search command extensions that have been added to the Splunk Search Processing Language (SPL) to perform machine learning analytics on your data, such as fitting and applying a model, along with commands to list, summarize, and delete learned models (see the sketch after this list). For details, see Search commands for machine learning.
- Custom visualizations, which are reusable visualizations for viewing and analyzing data in a particular format. For details, see Custom visualizations.
- Assistants, which are dashboards that guide you through the machine learning workflow. Each assistant features a different algorithm to fit and apply a model, with custom visualizations to help you interpret the results.
- A Showcase of examples that load different sample datasets into the assistants so that you can explore machine learning concepts. Each example prepopulates an assistant to demonstrate how to perform different types of machine learning analysis and prediction using best practices.
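For instance, the fit and apply commands can be run directly from the Search bar. The following sketch assumes the server_power.csv sample dataset that ships with the toolkit; the field and model names are illustrative:

```
| inputlookup server_power.csv
| fit LinearRegression "ac_power" from "total-cpu-utilization" "total-disk-utilization" into "example_power_model"
```

Applying the saved model to new data follows the same pattern:

```
... | apply "example_power_model"
```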
Explore the Showcase examples
If you want to jump right in and explore, go to the Showcase page and open the examples, organized by type of analytic. Each example uses a sample dataset to demonstrate aspects of machine learning. By default all examples are displayed, but you can filter them by use case:
- IT
- Security
- Business
- Internet of things
When you click an example, the corresponding assistant opens, prepopulated with the dataset options for that analytic.
For a tour of the Showcase, click the icon in the top right corner on the Showcase page.
For more about each example, see Showcase examples.
About the assistants
The assistants in the Machine Learning Toolkit are designed for use with your own data. In each assistant, perform a lookup on a dataset, then follow the workflow to select the fields to predict and fit the model.
Each assistant contains the following sections that vary depending on the type of machine learning analytic being performed:
- Create or Detect: Follow the workflow laid out in the assistant to create a new model, generate a forecast, or detect outliers. The workflow depends on the type of analytic, but usually includes performing a lookup on a dataset, selecting a field to predict or analyze, and selecting the fields or values to use in the analysis.
- Load Existing Settings: Each assistant keeps a history of the settings you used on previous runs, so you can view and compare the results of each attempt and reuse the settings from your more successful configurations.
- Raw Data Preview: This section is displayed for predictions and forecasts to show you the data that is being used.
- Validate: Use the tables and visualizations to determine how well the model was fitted, how well outliers were detected, or how well a forecast performed.
- Deploy: Click the buttons beneath the visualizations and tables to see different ways to use the analysis. For example, you can open the search in the Search app, show the SPL, or create an alert. A sketch of this kind of search follows this list.
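As an illustration of the kind of search an assistant can hand off to Search or to an alert, the following sketch applies a previously saved model and flags large prediction errors. The dataset, field names, model name, and threshold here are illustrative assumptions, not output copied from an assistant:

```
| inputlookup server_power.csv
| apply "example_power_model"
| eval residual = 'ac_power' - 'predicted(ac_power)'
| where abs(residual) > 10
```

Saved as an alert, a search like this reapplies the learned model on a schedule and triggers when the observed value drifts too far from the prediction.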
For a tour of an assistant, click the icon in the top right corner on the assistant page.
For details about the assistants, see:
- Predict Numeric Fields
- Predict Categorical Fields
- Detect Numeric Outliers
- Detect Categorical Outliers
- Forecast Time Series
- Cluster Numeric Events
Machine Learning Toolkit files
To view the source code for the Machine Learning Toolkit app, see $SPLUNK_HOME/etc/apps/Splunk_ML_Toolkit on Unix-based systems or %SPLUNK_HOME%\etc\apps\Splunk_ML_Toolkit on Windows systems.
| Subdirectory | Description |
| --- | --- |
| /appserver/static and /bin | Contains the underlying code files (Python, JavaScript, CSS, and images). |
| /default | Contains configuration and dashboard files. |
| /lookups | Contains the sample datasets used in the Showcase examples, along with more information about the datasets and their licenses. |
| /models | Contains the learned models. |
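The models in this directory are created by the fit command and can be managed from the Search bar with the model management commands mentioned earlier. A minimal sketch, assuming a model named "example_power_model"; run each line as its own search:

```
| listmodels

| summary "example_power_model"

| deletemodel "example_power_model"
```

listmodels lists the learned models, summary shows what a model learned during fitting, and deletemodel removes it.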