I am pretty new to Kerberos. I am using a shared Hadoop cluster. My admin has provided me with a username, a password, and the location of the KDC server.
Is it possible to use just the Java GSS API and the Hadoop UserGroupInformation class to access the Hadoop cluster?
For a non-Kerberos Hadoop cluster, this is the code snippet I would use to, say, read a file from HDFS:
import java.security.PrivilegedExceptionAction;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.hdfs.HdfsConfiguration;
import org.apache.hadoop.security.UserGroupInformation;

String uname = <Some username>;
UserGroupInformation ugi = UserGroupInformation.createRemoteUser(uname);
ugi.doAs(new PrivilegedExceptionAction<Void>() {
    public Void run() throws Exception {
        HdfsConfiguration hdfsConf = new HdfsConfiguration();
        // ... SETUP Configuration ...
        FileSystem fs = FileSystem.get(hdfsConf);
        // ... Use 'fs' to read/write etc ...
        return null;
    }
});
Now, for a secure cluster, I am also provided the Kerberos password for the user. Could someone tell me the exact changes I need to make to the above snippet so that it first contacts the KDC and does the equivalent of a kinit, and then proceeds with the HDFS operation?
Please remember that in the environment where I plan to deploy the Java app, 'kinit' might not be installed locally, so invoking the kinit process from within Java is not an option.
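Based on the JAAS and UserGroupInformation Javadocs, here is my rough guess at what the programmatic "kinit" might look like. I have not been able to verify it, the realm/KDC host/principal/password below are just placeholders, and I am assuming UserGroupInformation.getUGIFromSubject is available in my Hadoop version. Is this the right direction, or is there a simpler way?

import java.security.PrivilegedExceptionAction;
import java.util.HashMap;
import java.util.Map;
import javax.security.auth.Subject;
import javax.security.auth.callback.Callback;
import javax.security.auth.callback.CallbackHandler;
import javax.security.auth.callback.NameCallback;
import javax.security.auth.callback.PasswordCallback;
import javax.security.auth.callback.UnsupportedCallbackException;
import javax.security.auth.login.AppConfigurationEntry;
import javax.security.auth.login.LoginContext;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.hdfs.HdfsConfiguration;
import org.apache.hadoop.security.UserGroupInformation;

public class SecureHdfsClient {
    public static void main(String[] args) throws Exception {
        // Placeholders -- substitute the real principal, password, realm and KDC host
        final String principal = "myuser@EXAMPLE.COM";
        final String password  = "myPassword";

        // Point the JDK Kerberos implementation at the KDC, since there is no krb5.conf
        System.setProperty("java.security.krb5.realm", "EXAMPLE.COM");
        System.setProperty("java.security.krb5.kdc", "kdc.example.com");

        // In-memory JAAS configuration using the Oracle/OpenJDK Krb5LoginModule,
        // so no external jaas.conf file is needed
        javax.security.auth.login.Configuration jaasConf =
                new javax.security.auth.login.Configuration() {
            @Override
            public AppConfigurationEntry[] getAppConfigurationEntry(String name) {
                Map<String, String> options = new HashMap<String, String>();
                options.put("useTicketCache", "false");
                return new AppConfigurationEntry[] { new AppConfigurationEntry(
                        "com.sun.security.auth.module.Krb5LoginModule",
                        AppConfigurationEntry.LoginModuleControlFlag.REQUIRED,
                        options) };
            }
        };

        // Supplies the username and password to the login module
        CallbackHandler handler = new CallbackHandler() {
            public void handle(Callback[] callbacks) throws UnsupportedCallbackException {
                for (Callback cb : callbacks) {
                    if (cb instanceof NameCallback) {
                        ((NameCallback) cb).setName(principal);
                    } else if (cb instanceof PasswordCallback) {
                        ((PasswordCallback) cb).setPassword(password.toCharArray());
                    } else {
                        throw new UnsupportedCallbackException(cb);
                    }
                }
            }
        };

        // The programmatic equivalent of kinit: obtain a TGT into a JAAS Subject
        LoginContext lc = new LoginContext("HdfsClient", new Subject(), handler, jaasConf);
        lc.login();

        // Tell Hadoop that security is enabled, then wrap the Subject in a UGI
        final Configuration hdfsConf = new HdfsConfiguration();
        // ... SETUP Configuration as before, plus the security properties ...
        hdfsConf.set("hadoop.security.authentication", "kerberos");
        UserGroupInformation.setConfiguration(hdfsConf);
        UserGroupInformation ugi = UserGroupInformation.getUGIFromSubject(lc.getSubject());

        ugi.doAs(new PrivilegedExceptionAction<Void>() {
            public Void run() throws Exception {
                FileSystem fs = FileSystem.get(hdfsConf);
                // ... Use 'fs' to read/write etc ...
                return null;
            }
        });
    }
}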
Thanks in advance.