Kerberos Testing
Add a principal account for the hdfs user (enter the password continuum when prompted):

$ sudo kadmin.local
kadmin.local:  addprinc hdfs@CONTINUUM
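Equivalently, the principal can be created non-interactively (a sketch assuming MIT Kerberos, whose kadmin.local accepts a single command via -q and whose addprinc accepts -pw):

$ sudo kadmin.local -q "addprinc -pw continuum hdfs@CONTINUUM"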
Test authentication to HDFS via the command line. Without a ticket, the first listing should fail with an authentication error; it should succeed after kinit and fail again after kdestroy:

$ hadoop fs -ls /
$ kinit hdfs@CONTINUUM
(enter the password continuum when prompted)
$ hadoop fs -ls /
$ kdestroy
$ hadoop fs -ls /
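To avoid typing the password for every test, you could also authenticate from a keytab (a sketch; hdfs.keytab is a hypothetical filename, and note that MIT Kerberos' ktadd re-randomizes the principal's key, so the continuum password would stop working afterwards):

$ sudo kadmin.local -q "ktadd -k hdfs.keytab hdfs@CONTINUUM"
$ kinit -kt hdfs.keytab hdfs@CONTINUUM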
Get the hdfs-client.xml config file from https://github.com/PivotalRD/libhdfs3/wiki/Configure-Parameters and place it in your working directory. You'll need to change the hadoop.security.authentication property from simple to kerberos:
<!-- RPC client configuration -->
<property>
  <name>hadoop.security.authentication</name>
  <value>kerberos</value>
  <description>
    the RPC authentication method, valid values include "simple" or "kerberos". default is "simple"
  </description>
</property>
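If you'd rather script the edit than make it by hand, a minimal sketch using Python's standard library (assuming the downloaded file still carries the default value simple):

import xml.etree.ElementTree as ET

# Flip hadoop.security.authentication from "simple" to "kerberos" in place
tree = ET.parse('hdfs-client.xml')
for prop in tree.getroot().findall('property'):
    if prop.findtext('name') == 'hadoop.security.authentication':
        prop.find('value').text = 'kerberos'
tree.write('hdfs-client.xml')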
To make use of the local hdfs-client.xml configuration settings, unset the LIBHDFS3_CONF environment variable if you have set it:

$ unset LIBHDFS3_CONF
Note: you might also be able to pass these settings directly rather than doing the above two steps:

pars={'hadoop.security.authentication': 'kerberos'}
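For example, hdfs3's HDFileSystem accepts a pars dict of libhdfs3 configuration parameters, so the connection from the next step could be written as (a sketch, untested against this cluster):

import hdfs3

# pars overrides libhdfs3 settings, replacing the hdfs-client.xml edit above
f = hdfs3.HDFileSystem('ip-172-31-63-222.ec2.internal', port=8020,
                       pars={'hadoop.security.authentication': 'kerberos'})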
Now you can authenticate and use hdfs3 with the Kerberized HDFS:
$ kinit hdfs@CONTINUUM
$ ipython
In [1]: import hdfs3
In [2]: f = hdfs3.HDFileSystem('ip-172-31-63-222.ec2.internal', port=8020)
In [3]: f.ls('/')
In [4]: f.df()
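To go beyond listing, a quick round-trip write and read confirms the ticket is honored (a minimal sketch; the path /tmp/hdfs3-test.txt is arbitrary and assumes the hdfs principal can write there):

In [5]: with f.open('/tmp/hdfs3-test.txt', 'wb') as fh:
   ...:     fh.write(b'hello, kerberized hdfs')

In [6]: with f.open('/tmp/hdfs3-test.txt', 'rb') as fh:
   ...:     print(fh.read())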
View the tickets that hdfs3/libhdfs3 have generated. Besides the krbtgt ticket from kinit, you should see nn/... service tickets that libhdfs3 obtained for the NameNode:
$ klist
Ticket cache: FILE:/tmp/krb5cc_1000
Default principal: hdfs@CONTINUUM
Valid starting Expires Service principal
01/14/2016 06:08:49 01/14/2016 16:08:49 krbtgt/CONTINUUM@CONTINUUM
renew until 01/21/2016 06:08:47
01/14/2016 06:08:57 01/14/2016 16:08:49 nn/ip-172-31-63-222.ec2.internal@
renew until 01/21/2016 06:08:47
01/14/2016 06:08:57 01/14/2016 16:08:49 nn/ip-172-31-63-222.ec2.internal@CONTINUUM
renew until 01/21/2016 06:08:47