Munichong

Reputation: 4031

Hadoop: start-dfs.sh permission denied

I am installing Hadoop on my laptop. SSH works fine, but I cannot start Hadoop.

munichong@GrindPad:~$ ssh localhost
Welcome to Ubuntu 12.10 (GNU/Linux 3.5.0-25-generic x86_64)

 * Documentation:  https://help.ubuntu.com/

0 packages can be updated.
0 updates are security updates.

Last login: Mon Mar  4 00:01:36 2013 from localhost

munichong@GrindPad:~$ /usr/sbin/start-dfs.sh
chown: changing ownership of `/var/log/hadoop/root': Operation not permitted
starting namenode, logging to /var/log/hadoop/root/hadoop-munichong-namenode-GrindPad.out
/usr/sbin/hadoop-daemon.sh: line 136: /var/run/hadoop/hadoop-munichong-namenode.pid: Permission denied
usr/sbin/hadoop-daemon.sh: line 135: /var/log/hadoop/root/hadoop-munichong-namenode-GrindPad.out: Permission denied
head: cannot open `/var/log/hadoop/root/hadoop-munichong-namenode-GrindPad.out' for reading: No such file or directory
localhost: chown: changing ownership of `/var/log/hadoop/root': Operation not permitted
localhost: starting datanode, logging to /var/log/hadoop/root/hadoop-munichong-datanode-GrindPad.out
localhost: /usr/sbin/hadoop-daemon.sh: line 135: /var/log/hadoop/root/hadoop-munichong-datanode-GrindPad.out: Permission denied
localhost: /usr/sbin/hadoop-daemon.sh: line 136: /var/run/hadoop/hadoop-munichong-datanode.pid: Permission denied
localhost: head: cannot open `/var/log/hadoop/root/hadoop-munichong-datanode-GrindPad.out' for reading: No such file or directory
localhost: chown: changing ownership of `/var/log/hadoop/root': Operation not permitted
localhost: starting secondarynamenode, logging to /var/log/hadoop/root/hadoop-munichong-secondarynamenode-GrindPad.out
localhost: /usr/sbin/hadoop-daemon.sh: line 136: /var/run/hadoop/hadoop-munichong-secondarynamenode.pid: Permission denied
localhost: /usr/sbin/hadoop-daemon.sh: line 135: /var/log/hadoop/root/hadoop-munichong-secondarynamenode-GrindPad.out: Permission denied
localhost: head: cannot open `/var/log/hadoop/root/hadoop-munichong-secondarynamenode-GrindPad.out' for reading: No such file or directory

munichong@GrindPad:~$ sudo /usr/sbin/start-dfs.sh
[sudo] password for munichong: 
starting namenode, logging to /var/log/hadoop/root/hadoop-root-namenode-GrindPad.out
localhost: Permission denied (publickey,password).
localhost: Permission denied (publickey,password).

I used "sudo". But the permission is still denied.

Can anyone help me?

Thanks in advance!

Upvotes: 27

Views: 74760

Answers (11)

GerovanMi

Reputation: 11

I received a similar error message, which led me to this post.
The full error message was:

localhost: rcmd: socket: Permission denied

As described in this post, for this case you need to create the file /etc/pdsh/rcmd_default with ssh as its content:

echo "ssh" > /etc/pdsh/rcmd_default

Upvotes: 1

arunava maiti

Reputation: 391

I faced the same problem. When I tried to connect over SSH I got a message like "not found", so I went to the .ssh directory to debug with the following steps:

cd ~/.ssh

ssh-keygen -t rsa -p""

cat id_rsa.pub >> authorized_keys

... then it worked ...
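If SSH still prompts for a password after these steps, it may also be worth tightening the permissions, since sshd ignores an authorized_keys file or .ssh directory that is too open (this extra check is my addition, not part of the original answer):

chmod 700 ~/.ssh
chmod 600 ~/.ssh/authorized_keys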

Upvotes: 23

user5099519

Reputation:

Had to do this, like everyone above:

cd ~/.ssh

ssh-keygen -t rsa -p""

cat id_rsa.pub >> authorized_keys

But this was the key:

chmod 400 ~/.ssh/id_rsa
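To verify that the private key ended up with restrictive permissions, a quick check (my own addition) is:

ls -l ~/.ssh/id_rsa   # should show -r-------- (400) or -rw------- (600)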

Upvotes: 0

Divyang Shah

Reputation: 1668

Try changing the permissions of the locations where the HDFS namenode and datanode data are stored.

These locations are specified in hdfs-site.xml.

They should have permission 755, i.e. -rwxr-xr-x, and be owned by the user you run Hadoop as.

Also set the same permissions on the log location.
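For example, if hdfs-site.xml points the namenode and datanode directories (properties such as dfs.namenode.name.dir and dfs.datanode.data.dir, depending on the Hadoop version) at a path like /usr/local/hadoop_store (a purely illustrative path; substitute the values from your own hdfs-site.xml), the ownership and permissions could be set roughly like this:

sudo chown -R $USER /usr/local/hadoop_store
sudo chmod -R 755 /usr/local/hadoop_store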

Hope it will help!

Upvotes: 0

Mahesh Guru

Reputation: 11

For an R Hadoop installation hitting this permission-denied issue, the command below makes start-all.sh work:

sudo chown -R hadoop /usr/local/hadoop/ 
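To confirm the ownership change took effect (a quick check I am adding, not from the original answer):

ls -ld /usr/local/hadoop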

Upvotes: 0

Jack Smith

Reputation: 1

I solved it by setting permissions of all files to 777:

sudo chmod 777 /usr/local/hadoop-2.7.6/* -R

Upvotes: -3

Nealesh

Reputation: 648

I was stuck on the same issue for the last couple of hours but finally solved it. The Hadoop installation was extracted by the same user I use to run Hadoop, so user privileges were not the issue.
My configuration: an Ubuntu Linux machine on Google Cloud.

The Hadoop installation is in /home/, the Hadoop data directory is /var/lib/hadoop, and the directory access bits are 777 so anybody can access them. I SSHed into the remote machine, made changes to the config files, and executed start-dfs.sh; it then gave me "Permission denied (Public key)". So here is the solution. In the same SSH terminal:

1. ssh-keygen

2. It will ask for the file location where it will save the keys; I entered /home/hadoop/.ssh/id_rsa

3. It will ask for a passphrase; keep it empty for simplicity.

4. cat /home/hadoop/.ssh/id_rsa.pub >> .ssh/authorized_keys (to copy the newly generated public key to the auth file in your user's home/.ssh directory)

5. ssh localhost (a quick non-interactive check is sketched after this list)

6. start-dfs.sh (Now it should work!)
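As a quick check before the final step (my own suggestion, not part of the original answer), passwordless login can be verified non-interactively; with BatchMode, ssh fails instead of prompting if key authentication is not working:

ssh -o BatchMode=yes localhost echo ok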

Upvotes: 42

Muthukrishnan

Reputation: 2197

You are trying to ssh to your own machine (localhost) and missing the authorized_keys file which allows login.

This file in SSH specifies the SSH keys that can be used for logging into the user account for which the file is configured.

Follow the below two steps to configure it correctly.

Generate new keygen with the below command in terminal:

ssh-keygen

Press Enter to keep the default file name (the key is saved as id_rsa, with the public key in id_rsa.pub).

Now register the generated key file:

cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
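As an aside (not from the original answer), ssh-copy-id can do the registration step in one command, assuming it is available on your system:

ssh-copy-id -i ~/.ssh/id_rsa.pub localhost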

Upvotes: 5

bolor

Reputation: 1

I think the problem is the SSH connection between root and the regular user. Here is a copy of https://askubuntu.com/questions/497895/permission-denied-for-rootlocalhost-for-ssh-connection, which solved my case.

By default, the SSH server denies password-based login for root. In /etc/ssh/sshd_config,

change: PermitRootLogin without-password to PermitRootLogin yes

And restart SSH: sudo service ssh restart

Or, you can use SSH keys. If you don't have one, create one using ssh-keygen (stick to the default for the key, and skip the password if you feel like it). Then do sudo -s (or whatever your preferred method of becoming root is), and add an SSH key to /root/.ssh/authorized_keys:

cat /home/user/.ssh/id_rsa.pub >> /root/.ssh/authorized_keys
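For reference, the relevant line in /etc/ssh/sshd_config would end up looking like this (exact wording can vary between OpenSSH versions):

PermitRootLogin yes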

Upvotes: 0

koalagreener

Reputation: 121

Well, I was facing this problem too, and before I came across this question I used the method below.

1. sudo -s -H

Use this to log in as the root user.

2. ssh localhost

Log in using SSH (if you are just trying to use single-node mode).

3. ./sbin/start-dfs.sh

./sbin/start-yarn.sh

cd to your Hadoop installation root, then run these commands to start HDFS and YARN/MapReduce; you won't face the permission problem again.

My guess at the cause of this problem:

I used the root user to initialize the Hadoop environment, so several folders were created by root. When I later used my own account (say, 'Jake'), I didn't have permission to start the services (which need to access the log directories at that time).

Upvotes: 1

Milind Jindal

Reputation: 176

Try changing the ownership of the folder /var/log/hadoop/root to the user munichong. As on all systems, the log directory needs to be writable by Hadoop, so the user running it needs permission to edit the log folder and its contents.

sudo will not work in this case, because the permission to change the folder contents is needed even after the script finishes its work, i.e. while the Hadoop services keep running in the background.
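A concrete form of that change might be the following, using the paths from the error output above (verify them on your system before running; including /var/run/hadoop is my assumption based on the .pid errors):

sudo chown -R munichong /var/log/hadoop/root /var/run/hadoop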

Upvotes: 5
