Seen

Reputation: 4194

Server install of HDFS client fails

I am getting the following errors when installing the HDFS client through Ambari. I have reset the server several times but still cannot get it resolved. Any idea how to fix this?

stderr:

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/hdfs_client.py", line 120, in <module>
    HdfsClient().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 219, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/hdfs_client.py", line 36, in install
    self.configure(env)
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/hdfs_client.py", line 41, in configure
    hdfs()
  File "/usr/lib/python2.6/site-packages/ambari_commons/os_family_impl.py", line 89, in thunk
    return fn(*args, **kwargs)
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/hdfs.py", line 61, in hdfs
    group=params.user_group
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 152, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 118, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/xml_config.py", line 67, in action_create
    encoding = self.resource.encoding
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 152, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 118, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 87, in action_create
    raise Fail("Applying %s failed, parent directory %s doesn't exist" % (self.resource, dirname))
resource_management.core.exceptions.Fail: Applying File['/usr/hdp/current/hadoop-client/conf/hadoop-policy.xml'] failed, parent directory /usr/hdp/current/hadoop-client/conf doesn't exist
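For context: /usr/hdp/current/hadoop-client is normally a symlink managed by hdp-select, and its conf directory normally resolves to /etc/hadoop/conf. A quick diagnostic sketch to see what actually exists on the failing host (paths assume a standard HDP layout):

ls -l /usr/hdp/current/hadoop-client
ls -l /usr/hdp/current/hadoop-client/conf
hdp-select status hadoop-client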

Upvotes: 2

Views: 4574

Answers (4)

ketankk

Reputation: 2674

If you have done the installation multiple times, some packages might not have been cleaned up. To remove all HDP packages and start with a fresh installation, erase hdp-select:

yum -y erase hdp-select

If this does not help, remove all versions under /usr/hdp; delete that directory if it contains multiple versions of HDP.

Then remove all the installed packages such as hadoop, hdfs, zookeeper, etc.:

yum remove zookeeper* hadoop* hdp*
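A sketch of the /usr/hdp check described above (destructive; confirm the listing before deleting anything):

# list the HDP versions still present on the host
ls /usr/hdp
# if stale versions remain, clear the whole tree before reinstalling
rm -rf /usr/hdp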

Upvotes: 1

Stefan Papp

Reputation: 2255

I ran into the same problem with HDP 2.3.2 on CentOS 7.

The first problem: some conf files point to the /etc//conf directory (as they are supposed to). However, /etc//conf points back to the other conf directory, which leads to an endless symlink loop.

I was able to fix this problem by removing the /etc//conf symbolic links and creating real directories in their place.
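A minimal sketch of that step, assuming the looping link is the hadoop conf directory (verify with ls -l first; the exact component paths depend on your installation):

# replace the circular symlink with a real directory
rm /etc/hadoop/conf
mkdir /etc/hadoop/conf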

The second problem: if you run the Python scripts to clean up the installation and start over, several directories do not get recreated, such as the hadoop-client directory, which leads to exactly your error message. The cleanup script also does not work well in other respects: it leaves several users and directories behind, so you have to run userdel and groupdel yourself.
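For example (a sketch; the leftover accounts depend on which services you had installed, so check /etc/passwd and /etc/group before deleting anything):

userdel hdfs
userdel zookeeper
groupdel hadoop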

UPDATE: It seems this was a problem with HDP 2.3.2. In HDP 2.3.4 I did not run into it any more.

Upvotes: 0

Lincoln

Reputation: 181

/usr/hdp/current/hadoop-client/conf is a soft link that points to /etc/hadoop/conf.

I ran

python /usr/lib/python2.6/site-packages/ambari_agent/HostCleanup.py --silent --skip=users

After running it, /etc/hadoop/conf is removed.

However, reinstalling does not recreate it.

So you may have to create all the conf files yourself. I hope someone can patch this.
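A hedged sketch of recreating the missing pieces by hand before retrying the install (the link target assumes the standard HDP layout, where /usr/hdp/current/hadoop-client/conf points at /etc/hadoop/conf; adjust to your version):

mkdir -p /etc/hadoop/conf
# recreate the conf link only if it is missing
ln -s /etc/hadoop/conf /usr/hdp/current/hadoop-client/conf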

Upvotes: 1

melhior

Reputation: 139

Creating /usr/hdp/current/hadoop-client/conf on the failing host should solve the problem.
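For example (a sketch; note that if hadoop-client is itself a dangling symlink, you will need to fix that link before the directory can be created):

mkdir -p /usr/hdp/current/hadoop-client/conf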

Upvotes: -1
