Splunk® Enterprise

Splunk Analytics for Hadoop

Splunk Enterprise version 8.2 is no longer supported as of September 30, 2023. See the Splunk Software Support Policy for details. For information about upgrading to a supported version, see How to upgrade Splunk Enterprise.

Add or edit an HDFS provider in Splunk Web

Splunk Analytics for Hadoop reaches End of Life on January 31, 2025.

You can set up multiple providers, and multiple virtual indexes for each provider. When you add a virtual index, have the following information at hand:

  • The host name and port of the NameNode of the Hadoop cluster.
  • The host name and port of the JobTracker of the Hadoop cluster.
  • The installation directories of the Hadoop command-line libraries and the Java installation.
  • The path to a writable directory on the DataNode/TaskTracker *nix filesystem for which the Hadoop user account has read and write permission.
  • The path to a writable directory in HDFS that can be used exclusively by Splunk on this search head.

You can also add HDFS providers and virtual indexes by editing indexes.conf. See Set up a virtual index for instructions on setting up virtual indexes in the configuration file.
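
For orientation, the sketch below shows roughly what a provider and virtual index pair can look like in indexes.conf. The stanza names, host names, and paths are placeholders, and only a few representative settings are shown; see Set up a virtual index and Provider Configuration Variables for the authoritative list of settings.

    # Hypothetical provider stanza in indexes.conf
    [provider:MyHadoopProvider]
    vix.family = hadoop
    # Environment, cluster, and Splunk settings described in the
    # steps below are added here as additional vix.* attributes.

    # Hypothetical virtual index that uses the provider above
    [hadoop_weblogs]
    vix.provider = MyHadoopProvider
    vix.input.1.path = /data/weblogs/...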

Add a provider

1. In the top menu, select Settings > Virtual Indexes.

2. On the Virtual Indexes page, select the Providers tab, then click New Provider or the name of the provider that you want to edit.

3. On the Add New/Edit Provider page, give your provider a Name.

4. Select the Provider Family from the drop-down list. (Note that this field cannot be edited.)

5. Provide the following Environment Variables:

  • Java Home: Provide the path to your Java installation.
  • Hadoop Home: Provide the path to your Hadoop client directory.
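
If you manage the provider in indexes.conf instead, these two fields correspond to environment-variable settings on the provider stanza. A minimal sketch with placeholder paths:

    [provider:MyHadoopProvider]
    # Java Home
    vix.env.JAVA_HOME = /usr/lib/jvm/java-8-openjdk
    # Hadoop Home (Hadoop client directory)
    vix.env.HADOOP_HOME = /opt/hadoop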

6. Provide the following Hadoop Cluster Information:

  • Hadoop Version: Select the version of Hadoop that the cluster is running: Hadoop 1.0, Hadoop 2.0 with MRv1, or Hadoop 2.0 with YARN.
  • JobTracker: Provide the host name and port of the JobTracker.
  • File System: Provide the URI of the default file system.
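
In indexes.conf, the JobTracker and File System fields map to the settings sketched below (host names and ports are placeholders). The Hadoop Version choice determines which additional framework settings apply; see the YARN note later in this procedure.

    [provider:MyHadoopProvider]
    # JobTracker host and port
    vix.mapred.job.tracker = jobtracker.example.com:8021
    # Default file system (HDFS NameNode)
    vix.fs.default.name = hdfs://namenode.example.com:8020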

7. Provide the following Splunk Settings:

  • HDFS working directory: This is a path in HDFS (or whatever the default file system is) that you want to use as a working directory.
  • Job queue: The job queue to which you want the MapReduce jobs for this provider to be submitted.
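
In indexes.conf, the HDFS working directory maps to the setting sketched below (the path is a placeholder). The job queue is passed through to the Hadoop queue property; the attribute name shown here is an assumption, so check Provider Configuration Variables before relying on it.

    [provider:MyHadoopProvider]
    # HDFS working directory for this search head
    vix.splunk.home.hdfs = /user/splunk/searchhead1
    # Job queue for MapReduce jobs (attribute name is an assumption)
    vix.mapred.job.queue.name = default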

8. Click Add Secure Cluster to configure security for the cluster and provide your Kerberos Server configuration.
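
If you configure the secure cluster in indexes.conf rather than in Splunk Web, the Kerberos principal and keytab are typically set on the provider stanza. The attribute names below are assumptions based on common vix.* naming, not a confirmed reference; verify them against Provider Configuration Variables.

    [provider:MyHadoopProvider]
    # Kerberos identity used by this provider (attribute names are assumptions)
    vix.kerberos.principal = splunk@EXAMPLE.COM
    vix.kerberos.keytab = /etc/security/keytabs/splunk.keytab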

9. The Additional Settings fields specify your provider configuration variables. Splunk Analytics for Hadoop populates these preset configuration variables for each provider you create. You can leave the preset variables in place or edit them as needed. If you want to learn more about these settings, see Provider Configuration Variables in the reference section of this manual.

Note: If you are configuring Splunk Analytics for Hadoop to work with YARN, you must add new settings. See Required configuration variables for YARN in this manual.
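
As a rough sketch, the YARN-specific additions typically look like the following in indexes.conf (host names and ports are placeholders; see Required configuration variables for YARN for the definitive list):

    [provider:MyHadoopProvider]
    # Run MapReduce jobs on YARN
    vix.mapreduce.framework.name = yarn
    vix.yarn.resourcemanager.address = resourcemanager.example.com:8032
    vix.yarn.resourcemanager.scheduler.address = resourcemanager.example.com:8030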

10. Click Save.

Last modified on 30 October, 2023


