treeliked

Reputation: 383

start hadoop failed with hadoop-functions.sh

I tried to start Hadoop, but it failed and nothing started. The console log follows:

Mac:sbin lqs2$ sh start-all.sh
/Users/lqs2/Library/hadoop-3.1.1/libexec/hadoop-functions.sh: line 398: syntax error near unexpected token `<'
/Users/lqs2/Library/hadoop-3.1.1/libexec/hadoop-functions.sh: line 398: `done < <(for text in "${input[@]}"; do'
/Users/lqs2/Library/hadoop-3.1.1/libexec/hadoop-config.sh: line 70: hadoop_deprecate_envvar: command not found
/Users/lqs2/Library/hadoop-3.1.1/libexec/hadoop-config.sh: line 87: hadoop_bootstrap: command not found
/Users/lqs2/Library/hadoop-3.1.1/libexec/hadoop-config.sh: line 104: hadoop_parse_args: command not found
/Users/lqs2/Library/hadoop-3.1.1/libexec/hadoop-config.sh: line 105: shift: : numeric argument required
WARNING: Attempting to start all Apache Hadoop daemons as lqs2 in 10 
seconds.
WARNING: This is not a recommended production deployment configuration.
WARNING: Use CTRL-C to abort.

I have tried many ways to solve this, but nothing worked. I even reinstalled the latest version, but the error is the same. It is driving me mad.

Any answer is helpful. Thanks.

Upvotes: 0

Views: 3482

Answers (2)

Debraj Ray

Reputation: 29

In my case, I was seeing this error on macOS after installing Hadoop with Homebrew. The solution was to do a fresh install after downloading the Hadoop (3.2.1) binary directly from the official website. While installing, I set the HADOOP_HOME and JAVA_HOME environment variables.
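For reference, a minimal sketch of setting those two variables in a shell profile. The paths below are assumptions for a manual tarball install, not the actual layout on any particular machine; adjust them to where you unpacked Hadoop and where your JDK lives.

```shell
# Hypothetical install location for a manually downloaded Hadoop 3.2.1 tarball
export HADOOP_HOME="$HOME/hadoop-3.2.1"
# On macOS, /usr/libexec/java_home prints the path of the default JDK
export JAVA_HOME="$(/usr/libexec/java_home)"
# Make the hadoop/hdfs/yarn commands and the start/stop scripts available
export PATH="$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin"
```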

A word of caution: I found that the issue can occur if the following environment variables are defined in hadoop-env.sh:

export HDFS_NAMENODE_USER="root"
export HDFS_DATANODE_USER="root"
export HDFS_SECONDARYNAMENODE_USER="root"
export YARN_RESOURCEMANAGER_USER="root"
export YARN_NODEMANAGER_USER="root"

I had initially added these variables while trying to fix the issue. Ultimately I removed them and the error disappeared.

Note that I executed all the Hadoop commands and scripts as a non-root user, and also upgraded bash to version 5.0.17.

Upvotes: 0

OneCricketeer

Reputation: 191743

The Hadoop scripts require bash, not sh:

$ chmod +x start-all.sh
$ ./start-all.sh
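To see why the shell matters, here is a minimal sketch (the file name `demo.sh` is hypothetical) of the construct on line 398 of hadoop-functions.sh: `done < <(...)` is bash process substitution, which a strict POSIX sh (e.g. dash, or an old bash running in POSIX mode) rejects with exactly the "syntax error near unexpected token `<'" from the question.

```shell
# Reproduce the bash-only construct from hadoop-functions.sh line 398
cat > demo.sh <<'EOF'
while read -r line; do
  echo "got: $line"
done < <(printf 'a\nb\n')
EOF

bash demo.sh   # works: prints "got: a" then "got: b"
# sh demo.sh   # under a strict POSIX shell this fails with:
#              # "syntax error near unexpected token `<'"
```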

Though I would suggest starting HDFS and YARN separately (with start-dfs.sh and start-yarn.sh) so that you can isolate other issues.

You also need to downgrade Hadoop to at most the latest 2.7 release if you want Spark to work against this cluster.

Upvotes: 1
