Reputation: 3990
I have configured HDFS with Kerberos on Windows 8.
The NameNode logs in and starts successfully. The DataNode logs in successfully but fails to start.
Exception
14/12/10 17:51:55 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
14/12/10 17:52:00 INFO security.UserGroupInformation: Login successful for user [email protected] using keytab file C:\kumar.keytab
14/12/10 17:52:00 INFO impl.MetricsConfig: loaded properties from hadoop-metrics2.properties
14/12/10 17:52:00 INFO impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
14/12/10 17:52:00 INFO impl.MetricsSystemImpl: DataNode metrics system started
14/12/10 17:52:05 INFO datanode.DataNode: Configured hostname is hostname.WORKGROUP
14/12/10 17:52:05 FATAL datanode.DataNode: Exception in secureMain
java.lang.RuntimeException: Cannot start secure cluster without privileged resources.
        at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:737)
Looking at the documentation, particularly the content quoted below, it seems I need to set JSVC_HOME. How do I set that, and where can I get the JSVC package? I need a solution to this problem.
Secure DataNode
Because the data transfer protocol of DataNode does not use the RPC framework of Hadoop, DataNode must authenticate itself by using privileged ports which are specified by dfs.datanode.address and dfs.datanode.http.address. This authentication is based on the assumption that the attacker won't be able to get root privileges.
When you execute hdfs datanode command as root, server process binds privileged port at first, then drops privilege and runs as the user account specified by HADOOP_SECURE_DN_USER. This startup process uses jsvc installed to JSVC_HOME. You must specify HADOOP_SECURE_DN_USER and JSVC_HOME as environment variables on start up (in hadoop-env.sh).
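On a Linux setup, the quoted instructions would translate into something like the following lines in hadoop-env.sh. This is only a sketch: the hdfs user and the jsvc path are placeholders, not values from my cluster, and jsvc itself is distributed as part of Apache Commons Daemon.

# Sketch only: placeholder user name and path, adjust for your environment.
# The DataNode binds its privileged ports as root, then drops to this user.
export HADOOP_SECURE_DN_USER=hdfs
# Directory containing the jsvc binary (jsvc ships with Apache Commons Daemon).
export JSVC_HOME=/usr/lib/jsvc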
Upvotes: 3
Views: 5238
Reputation: 1043
It seems the JSVC package is not available for Windows. After spending a long time on this, I figured out a solution: instead of using JSVC we can use SASL.
Hadoop 2.6.0 supports SASL for authenticating the data transfer protocol. Refer to the documentation: Secure DataNode.
To quote that link:
As of version 2.6.0, SASL can be used to authenticate the data transfer protocol. In this configuration, it is no longer required for secured clusters to start the DataNode as root using jsvc and bind to privileged ports. To enable SASL on data transfer protocol, set dfs.data.transfer.protection in hdfs-site.xml, set a non-privileged port for dfs.datanode.address, set dfs.http.policy to HTTPS_ONLY and make sure the HADOOP_SECURE_DN_USER environment variable is not defined. Note that it is not possible to use SASL on data transfer protocol if dfs.datanode.address is set to a privileged port. This is required for backwards-compatibility reasons.
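Putting those settings together, the relevant part of hdfs-site.xml looks roughly like the sketch below. The port number and the "authentication" protection level are illustrative placeholders, not values taken from the question (integrity and privacy are the other valid protection levels).

<!-- Sketch of the SASL data-transfer settings described above.
     Port and protection level are placeholders; adjust for your cluster. -->
<property>
  <name>dfs.data.transfer.protection</name>
  <!-- one of: authentication, integrity, privacy -->
  <value>authentication</value>
</property>
<property>
  <name>dfs.datanode.address</name>
  <!-- must be a non-privileged port (1024 or higher) for SASL to be used -->
  <value>0.0.0.0:61004</value>
</property>
<property>
  <name>dfs.http.policy</name>
  <value>HTTPS_ONLY</value>
</property>

Also make sure the HADOOP_SECURE_DN_USER environment variable is not set when starting the DataNode, otherwise it will still attempt the jsvc/privileged-port startup path.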
Upvotes: 5