Serban Stoenescu

Reputation: 3876

Java - Failed to specify server's Kerberos principal name

I have some integration tests that use HDFS with Kerberos authentication. When I run them, I get this exception:

java.io.IOException: Failed on local exception: java.io.IOException: java.lang.IllegalArgumentException: Failed to specify server's Kerberos principal name; Host Details : local host is: "Serbans-MacBook-Pro.local/1.2.3.4"; destination host is: "10.0.3.33":8020; 
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:772)
    at org.apache.hadoop.ipc.Client.call(Client.java:1472)
    at org.apache.hadoop.ipc.Client.call(Client.java:1399)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)
    at com.sun.proxy.$Proxy12.getFileInfo(Unknown Source)

I believe that everything is configured correctly:

System.setProperty("java.security.krb5.realm", "...");
System.setProperty("java.security.krb5.kdc", "...");

Configuration conf = new Configuration();
conf.set("fs.defaultFS", "hdfs://10.0.3.33:8020");
conf.set("fs.hdfs.impl", "org.apache.hadoop.hdfs.DistributedFileSystem");
conf.set("hadoop.security.authentication", "Kerberos");
UserGroupInformation.setConfiguration(conf);
UserGroupInformation.loginUserFromKeytab("user@...", "/Users/user/user.keytab");

What do you believe the problem is? On the cluster host (10.0.3.33), core-site.xml and hdfs-site.xml are configured correctly, but as the exception shows, I am not running the tests from that host.

Any ideas what to do, in order to be able to run the tests from any host?

Thanks, Serban

Upvotes: 1

Views: 5097

Answers (1)

Kaushal

Reputation: 3367

If you are using a Hadoop version older than 2.6.2, the default value for the principal pattern property is not provided in hdfs-site.xml, so you need to set the pattern property manually in your configuration:

config.set("dfs.namenode.kerberos.principal.pattern", "hdfs/*@BDBIZVIZ.COM"); 
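For context, here is a minimal sketch of how this property could be added to the configuration shown in the question. The class name, realm, KDC host, principal, and pattern value are placeholders, not values taken from the question, so substitute your own.

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.UserGroupInformation;

public class KerberosHdfsCheck {
    public static void main(String[] args) throws IOException {
        // Placeholders: replace realm, KDC, principal, and keytab path with real values.
        System.setProperty("java.security.krb5.realm", "EXAMPLE.COM");
        System.setProperty("java.security.krb5.kdc", "kdc.example.com");

        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://10.0.3.33:8020");
        conf.set("fs.hdfs.impl", "org.apache.hadoop.hdfs.DistributedFileSystem");
        conf.set("hadoop.security.authentication", "kerberos");
        // The property from this answer: accept NameNode principals matching this pattern.
        conf.set("dfs.namenode.kerberos.principal.pattern", "hdfs/*@EXAMPLE.COM");

        UserGroupInformation.setConfiguration(conf);
        UserGroupInformation.loginUserFromKeytab("user@EXAMPLE.COM", "/Users/user/user.keytab");

        // Simple sanity check that the authenticated client can reach the NameNode.
        FileSystem fs = FileSystem.get(conf);
        System.out.println(fs.exists(new Path("/")));
    }
}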

Upvotes: 4
