Reputation: 321
I'm new to big data, Hadoop and Linux. We have a small 4-node cluster, 1 master and 3 workers, running Ambari 2.1 and Hadoop 2.2.6. All machines run Ubuntu Server 12.04. Everything is properly configured and works well, including DNS, SSH, NTP, etc. However, when I tried to install HUE 3.8.1 on top of that, following this guide: http://gethue.com/hadoop-hue-3-on-hdp-installation-tutorial/ the installation was successful, and I'm able to open it in the browser and log in. But then it shows me 3 misconfigurations:
The folder /home/user/hue and everything in it is owned by the hdfs user and belongs to the hdfs group. When I first logged in to HUE, I created the ADMIN user. Do I need to add this admin user to some group, and if so, to which one? Also, Spark is installed as part of the Ambari package and is up and running. Do I need to install Livy Spark separately, or is this again some configuration issue? So confused now... I've double-checked all the config files and everything looks OK to me; any help on where to look, or even a direction to dig in, would be appreciated. All steps in the configuration guide were followed, with the correct ports, hosts and addresses. Any ideas what is wrong and how to start HUE? Thanks in advance.
Upvotes: 1
Views: 1198
Reputation: 7082
This is a warning that the / path of HDFS is not owned by the default 'hdfs' user. If the owner is someone else and that is intentional, you can update Hue here https://github.com/cloudera/hue/blob/master/desktop/conf.dist/hue.ini#L60
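You can check who actually owns the HDFS root with `hdfs dfs -ls /` (the owner column of the listing). If it is not `hdfs`, a rough sketch of the matching change in hue.ini would be something like the following; the property name is taken from the hue.ini template linked above, and `myowner` is a placeholder for whatever user your listing shows:

```ini
# hue.ini -- tell Hue which user is the HDFS superuser,
# if your / path is not owned by the default 'hdfs' user.
[desktop]
  # Replace 'myowner' with the actual owner of / in your HDFS.
  default_hdfs_superuser=myowner
```

Restart Hue after editing hue.ini so the change is picked up.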
You should enter a random string of text for the 'secret_key' here https://github.com/cloudera/hue/blob/master/desktop/conf.dist/hue.ini#L21; that way nobody can hijack your user sessions or crack your user passwords.
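One simple, hedged way to produce such a random string (any long random value works; 50 characters is just a reasonable length, not a Hue requirement) is a short Python snippet:

```python
import secrets
import string

# Build a 50-character random string suitable for hue.ini's secret_key.
# secrets (Python 3.6+) uses a cryptographically secure random source,
# unlike the plain random module.
alphabet = string.ascii_letters + string.digits
secret_key = "".join(secrets.choice(alphabet) for _ in range(50))
print(secret_key)
```

Paste the printed value into the `secret_key=` line of hue.ini and restart Hue.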
This warning can be ignored if you don't use the Spark App http://gethue.com/spark-notebook-and-livy-rest-job-server-improvements/. It requires the Livy Spark Server to be running.
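If you do want the Spark App, Hue needs to know where Livy is listening. A rough sketch of the relevant hue.ini section, with property names taken from the hue.ini template (check the exact names in your Hue version, and substitute your own host; 8998 is Livy's usual default port):

```ini
# hue.ini -- point the Spark App at a running Livy server.
[spark]
  # Host where the Livy Spark Server is running (placeholder hostname).
  livy_server_host=your-livy-host.example.com
  # Port Livy listens on (8998 is the common default).
  livy_server_port=8998
```

Spark itself being installed via Ambari is not enough: Livy is a separate REST job server that sits between Hue and Spark, so it has to be installed and started on its own.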
Upvotes: 2