About pass-through authentication
Splunk Analytics for Hadoop reaches End of Life on January 31, 2025.
To search a virtual index, Splunk Analytics for Hadoop submits MapReduce jobs and accesses HDFS files. By default, Splunk Analytics for Hadoop does this as the Splunk Analytics for Hadoop Superuser. Using pass-through authentication, however, you can control the users under which Splunk Analytics for Hadoop submits MapReduce jobs and accesses HDFS files. You can also specify which queue the MapReduce jobs use.
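When pass-through authentication is in effect, the MapReduce jobs that Splunk Analytics for Hadoop submits appear in the cluster under the mapped Hadoop user and queue rather than under the Superuser. One way to confirm this on the Hadoop side is to list running applications with the standard YARN CLI; the job name, user, and queue in the sample output below are illustrative:

    # List running applications and check the User and Queue columns
    yarn application -list -appStates RUNNING

    # Example output (abridged and illustrative): the search's job runs as
    # "mattsantos" in queue "Products" instead of as the Superuser
    # Application-Id                   Application-Name   User         Queue     State
    # application_1700000000000_0042   SPLK_search        mattsantos   Products  RUNNING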
About Splunk Analytics for Hadoop Superusers, Splunk Analytics for Hadoop users, and Hadoop users
A Splunk Analytics for Hadoop Superuser is one (or both) of the following:
- The user used to install the Splunk search head.
- The Kerberos keytab user for a provider.
Hadoop users are the users that a Hadoop cluster allows to:
- Submit MapReduce jobs.
- Access HDFS files. By default, Hadoop relies on the operating system users and groups on your nodes for this, though your Hadoop cluster might be configured differently (see the example after this list).
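Because HDFS permissions are evaluated against those same operating system user and group names, a standard HDFS listing shows which users and groups can read or write a given path. The path, owner, and group names below are illustrative:

    # Show the owner, group, and permission bits for a directory in HDFS
    hdfs dfs -ls /data/products

    # Example output (illustrative): only the user "mattsantos" and the
    # "products" group can write to this directory
    # drwxrwxr-x   - mattsantos products          0 2024-06-01 12:00 /data/products/raw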
How pass-through authentication works with your users
Pass-through authentication lets you make the Splunk Analytics for Hadoop Superuser a proxy for any number of configured Splunk Analytics for Hadoop users. This way, Splunk Analytics for Hadoop users can act as Hadoop users, who own the associated jobs, tasks, and files in Hadoop, and you can limit access to files in HDFS.
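For the Superuser to act as a proxy, the Hadoop cluster itself must also permit impersonation. In stock Hadoop, this is controlled by the proxy-user properties in core-site.xml on the cluster. The Superuser name "splunk", the host name, and the group name below are illustrative:

    <!-- core-site.xml: allow the "splunk" Superuser to impersonate other users -->
    <property>
      <name>hadoop.proxyuser.splunk.hosts</name>
      <value>searchhead.example.com</value>
    </property>
    <property>
      <name>hadoop.proxyuser.splunk.groups</name>
      <value>hadoop-users</value>
    </property>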
Splunk Analytics for Hadoop users can be created with Splunk's native user functionality or LDAP. For more information about setting up users, see the Securing Splunk Enterprise manual.
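If you manage users natively in Splunk, one way to create a Splunk Analytics for Hadoop user is with the Splunk CLI. The user name, role, and password placeholders below are illustrative:

    # Create a native Splunk user who can later be mapped to a Hadoop user
    splunk add user msantos -password <password> -role user -auth admin:<admin-password>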
Ways you can use pass-through authentication
The following use cases describe common ways you might use pass-through authentication:
- One Splunk Analytics for Hadoop user to one Hadoop user: You map a single Splunk Analytics for Hadoop user to a specific Hadoop user, such as one associated with a particular queue or data set. For example, the Splunk Analytics for Hadoop user "msantos" runs on the Hadoop cluster as the user "mattsantos" with the queue "Products."
- Many Splunk Analytics for Hadoop users to one Hadoop user: You might want multiple Splunk Analytics for Hadoop users to act as a single Hadoop user. For example, all of the following Splunk Analytics for Hadoop users could run as the "Executive" user on Hadoop and be assigned the queue "Products":
- jbartlett
- lmcgarry
- jlyman
- Splunk Analytics for Hadoop users to the same Hadoop user with different queues: You can also run Splunk Analytics for Hadoop users as the same Hadoop user but assign them different queues. For example, the users from the previous example still run as the "Executive" user on Hadoop, but are respectively assigned the following queues:
- jbartlett runs as user "Executive" and is assigned the queue "prod-admin."
- lmcgarry runs as user "Executive" and is assigned the queue "prod-staff."
- jlyman runs as user "Executive" and is assigned the queue "prod-staff."
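Taken together, these use cases amount to a per-user mapping from a Splunk Analytics for Hadoop user to a Hadoop user and, optionally, a queue. The summary below restates the examples in this topic; it is a conceptual sketch, not actual Splunk configuration syntax:

    # Splunk user  ->  Hadoop user   queue
    msantos        ->  mattsantos    Products
    jbartlett      ->  Executive     prod-admin
    lmcgarry       ->  Executive     prod-staff
    jlyman         ->  Executive     prod-staff

On the Hadoop side, the queue assignment typically corresponds to the standard mapreduce.job.queuename property of the submitted job.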