Majid Azimi

Reputation: 5745

yarn-daemon.sh writes to wrong log directory in HDP

I have installed Hortonworks HDP version 2.2.4.2-2 on my laptop. I have started the HDFS services (the NameNode, Secondary NameNode, and all DataNodes), and I can browse HDFS through the NameNode web interface. The problem is with the ResourceManager. The HDP companion files set a wrong value for HADOOP_LIBEXEC_DIR: it is set to /usr/lib/hadoop/libexec, but the correct path is /usr/hdp/2.2.4.2-2/hadoop/libexec. After I changed this parameter, I started the ResourceManager (as the yarn user) with this command:

/usr/hdp/current/hadoop-yarn-resourcemanager/sbin/yarn-daemon.sh --config /etc/hadoop/conf start resourcemanager
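For reference, the libexec change was essentially the one-liner below; in my setup the variable was exported in the env scripts under /etc/hadoop/conf (the exact file may differ in your environment):

export HADOOP_LIBEXEC_DIR=/usr/hdp/2.2.4.2-2/hadoop/libexec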

The startup then fails with these errors:

mkdir: cannot create directory `/var/log/hadoop-yarn': Permission denied
chown: cannot access `/var/log/hadoop-yarn/yarn': No such file or directory
mkdir: cannot create directory `/var/run/hadoop-yarn': Permission denied
starting resourcemanager, logging to /var/log/hadoop-yarn/yarn/yarn-yarn-resourcemanager-rm.hdp.local.out
/usr/hdp/current/hadoop-yarn-resourcemanager/sbin/yarn-daemon.sh: line 129: cd: /usr/lib/hadoop-yarn: No such file or directory
/usr/hdp/current/hadoop-yarn-resourcemanager/sbin/yarn-daemon.sh: line 131: /var/run/hadoop-yarn/yarn/yarn-yarn-resourcemanager.pid: No such file or directory
/usr/hdp/current/hadoop-yarn-resourcemanager/sbin/yarn-daemon.sh: line 130: /var/log/hadoop-yarn/yarn/yarn-yarn-resourcemanager-rm.hdp.local.out: No such file or directory
head: cannot open `/var/log/hadoop-yarn/yarn/yarn-yarn-resourcemanager-rm.hdp.local.out' for reading: No such file or directory
/usr/hdp/current/hadoop-yarn-resourcemanager/sbin/yarn-daemon.sh: line 135: /var/log/hadoop-yarn/yarn/yarn-yarn-resourcemanager-rm.hdp.local.out: No such file or directory
/usr/hdp/current/hadoop-yarn-resourcemanager/sbin/yarn-daemon.sh: line 136: /var/log/hadoop-yarn/yarn/yarn-yarn-resourcemanager-rm.hdp.local.out: No such file or directory

The problem is that yarn-daemon.sh is using the wrong log directory; it doesn't match my environment variables:

[yarn@rm ~]$ echo ${YARN_LOCAL_DIR}
/hadoop/yarn/local
[yarn@rm ~]$ echo ${YARN_LOG_DIR}
/var/log/hadoop/yarn
[yarn@rm ~]$ echo ${YARN_LOCAL_LOG_DIR}
/hadoop/yarn/logs
[yarn@rm ~]$ echo ${YARN_PID_DIR}
/var/run/hadoop/yarn

Is this really a bug, or am I doing something wrong?

Upvotes: 3

Views: 1405

Answers (1)

David Kjerrumgaard

Reputation: 1076

Majid,

The startup script you are using, /usr/hdp/current/hadoop-yarn-resourcemanager/sbin/yarn-daemon.sh, sources the yarn-env.sh script located in the configuration directory you specified, in your case

--config /etc/hadoop/conf

Chances are that those environment variables are being overridden in that script. If you are using the companion files provided by Hortonworks, then they are indeed being changed to:

export YARN_LOG_DIR=/var/log/hadoop-yarn/$USER
export YARN_PID_DIR=/var/run/hadoop-yarn/$USER
export HADOOP_LIBEXEC_DIR=/usr/lib/hadoop/libexec
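You can quickly confirm what that file actually exports with something along these lines (assuming the standard /etc/hadoop/conf layout):

grep -E 'YARN_LOG_DIR|YARN_PID_DIR|HADOOP_LIBEXEC_DIR' /etc/hadoop/conf/yarn-env.sh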

Your best bet is to change these values in your /etc/hadoop/conf/yarn-env.sh script to the values you desire and retry.
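As a rough sketch (using the values from your own environment; the hadoop group and the exact paths are assumptions, so adjust them to your setup), the edit and the matching directories would look something like this:

# /etc/hadoop/conf/yarn-env.sh
export YARN_LOG_DIR=/var/log/hadoop/yarn
export YARN_PID_DIR=/var/run/hadoop/yarn
export HADOOP_LIBEXEC_DIR=/usr/hdp/2.2.4.2-2/hadoop/libexec

# create the directories the script writes to (run as root; group ownership is assumed)
mkdir -p /var/log/hadoop/yarn /var/run/hadoop/yarn
chown -R yarn:hadoop /var/log/hadoop/yarn /var/run/hadoop/yarn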

Upvotes: 1
