user360321

Reputation: 177

Failed to set permissions of path: \tmp

Failed to set permissions of path: \tmp\hadoop-MayPayne\mapred\staging\MayPayne2016979439\.staging to 0700 

I'm getting this error when the MapReduce job executes. I was using Hadoop 1.0.4, then I learned this is a known issue and tried 1.2.0, but the issue still exists. Can anyone tell me a Hadoop version in which this issue has been resolved?

Thank you all in advance

Upvotes: 7

Views: 13414

Answers (4)

Muhammad Soliman

Reputation: 23876

Downloading hadoop-core-0.20.2.jar and putting it in Nutch's lib directory resolved the problem for me.

(In the case of Windows) If it is still not solved for you, try using this Hadoop patch.

Upvotes: 5

Suraj Mathew

Reputation: 41

Set the VM argument below

-Dhadoop.tmp.dir=<A directory location with write permission>

to override the default /tmp directory
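
An alternative, if you submit the job from your own driver class, is to set the property directly on the job Configuration. This is only a sketch: the class name, job name, and directory path below are placeholders, not anything from the answer above.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.mapreduce.Job;

    public class TmpDirOverride {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Point hadoop.tmp.dir at a directory the current user can write to;
            // "C:/hadoop/tmp" is only an example path.
            conf.set("hadoop.tmp.dir", "C:/hadoop/tmp");

            Job job = new Job(conf, "my-job");
            // ... configure mapper, reducer, input/output paths as usual ...
            // job.waitForCompletion(true);
        }
    }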

Also, using hadoop-core-0.20.2.jar (http://mvnrepository.com/artifact/org.apache.hadoop/hadoop-core/0.20.2) will solve the reported issue.

Upvotes: 4

VirtualLogic

Reputation: 746

I was getting the same exception while running Nutch 1.7 on Windows 7.

bin/nutch crawl urls -dir crawl11 -depth 1 -topN 5

The following steps worked for me

  1. Download the pre-built JAR, patch-hadoop_7682-1.0.x-win.jar, from the Download section, where you can also find the steps for Hadoop.
  2. Copy patch-hadoop_7682-1.0.x-win.jar to the ${NUTCH_HOME}/lib directory
  3. Modify ${NUTCH_HOME}/conf/nutch-site.xml to enable the overridden implementation as shown below (a sketch of what this override does follows after these steps):

    <?xml version="1.0"?>
    <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
    <!-- Put site-specific property overrides in this file. -->
    <configuration>
        <property>
            <name>fs.file.impl</name>
            <value>com.conga.services.hadoop.patch.HADOOP_7682.WinLocalFileSystem</value>
            <description>Enables patch for issue HADOOP-7682 on Windows</description>
        </property>
    </configuration>
    
  4. Run your job as usual (using Cygwin).
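
For background, the patch class referenced in nutch-site.xml replaces the default local file system so that the permission call no longer aborts the job on Windows. The code below is only an illustration of that idea against the Hadoop 1.0.x API, not the actual source of the patch jar; the class name is a placeholder.

    import java.io.IOException;

    import org.apache.hadoop.fs.LocalFileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.fs.permission.FsPermission;

    // Illustrative only: swallow permission failures instead of failing the job.
    public class WinLocalFileSystemSketch extends LocalFileSystem {
        @Override
        public void setPermission(Path p, FsPermission permission) throws IOException {
            try {
                super.setPermission(p, permission);
            } catch (IOException e) {
                // On Windows the underlying permission call fails; ignore it here.
            }
        }
    }

If you wrote something like this yourself, you would register its fully qualified class name as the fs.file.impl value, the same way as in step 3.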

Upvotes: 11

user360321

Reputation: 177

I managed to solve this by changing the hadoop-core jar a little. I changed the error-causing method in FileUtil.java inside hadoop-core.jar, recompiled it, and included the rebuilt jar in my Eclipse project. Now the error is gone. I suggest everyone do the same.
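
For anyone trying the same thing: the exception in the question comes from a private helper in org.apache.hadoop.fs.FileUtil. The fragment below is only a sketch of the kind of edit described above, against the Hadoop 1.0.x sources, assuming the class's existing imports and commons-logging LOG field; it belongs inside FileUtil.java, and is meant for local development on Windows only.

    // Original helper throws the "Failed to set permissions of path" IOException;
    // this variant only logs a warning instead of failing the job.
    private static void checkReturnValue(boolean rv, File p,
                                         FsPermission permission)
            throws IOException {
        if (!rv) {
            LOG.warn("Failed to set permissions of path: " + p + " to "
                + String.format("%04o", permission.toShort()));
        }
    }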

Upvotes: -4
