horatio1701d

Reputation: 9169

java.io.FileNotFoundException Error on New EMR Cluster

I'm failing to understand how to resolve the error below, which I receive when running the Hive CLI on a new EMR server. I've already confirmed that the user being used has permission to write to /var/log/hive/user/hadoop.

log4j:ERROR setFile(null,true) call failed.
java.io.FileNotFoundException: /var/log/hive/user/hadoop/hive.log (No such file or directory)
    at java.io.FileOutputStream.open(Native Method)
    at java.io.FileOutputStream.<init>(FileOutputStream.java:221)
    at java.io.FileOutputStream.<init>(FileOutputStream.java:142)
    at org.apache.log4j.FileAppender.setFile(FileAppender.java:294)
    at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
    at org.apache.log4j.DailyRollingFileAppender.activateOptions(DailyRollingFileAppender.java:223)
    at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
    at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
    at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
    at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
    at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
    at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:648)
    at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:514)
    at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
    at org.apache.log4j.PropertyConfigurator.configure(PropertyConfigurator.java:415)
    at org.apache.hadoop.hive.common.LogUtils.initHiveLog4jDefault(LogUtils.java:127)
    at org.apache.hadoop.hive.common.LogUtils.initHiveLog4jCommon(LogUtils.java:77)
    at org.apache.hadoop.hive.common.LogUtils.initHiveLog4j(LogUtils.java:58)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:586)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:570)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
log4j:ERROR Either File or DatePattern options are not set for appender [DRFA].
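For reference, this is roughly how I checked the permissions before posting (run on the EMR master node; the paths are the ones from the error above):

    # which user is the hive cli running as?
    whoami
    # ownership and permissions along the log path
    ls -ld /var/log/hive /var/log/hive/user /var/log/hive/user/hadoop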

Upvotes: 2

Views: 2758

Answers (1)

Pham Van Vung

Reputation: 343

It's due to the missing folder /var/log/hive/user/hadoop/.

Run the following commands (a condensed two-command version is sketched after this list):

  1. Change the owner of /var/log/hive/ to the hadoop user:

    sudo chown hadoop -R /var/log/hive
    
  2. Create the /var/log/hive/user/hadoop/ folder:

    mkdir /var/log/hive/user
    mkdir /var/log/hive/user/hadoop
    
  3. Run hive again and it should work.
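For reference, the same fix condensed into two commands (assuming the default hadoop user and the paths from the error above; run on the EMR master node):

    # create the whole log directory chain in one go, then hand it to the hadoop user
    sudo mkdir -p /var/log/hive/user/hadoop
    sudo chown -R hadoop /var/log/hive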

Upvotes: 5
