Reputation: 177
Failed to set permissions of path: \tmp\hadoop-MayPayne\mapred\staging\MayPayne2016979439\.staging to 0700
I'm getting this error when the MapReduce job executes. I was using Hadoop 1.0.4 and then learned this is a known issue, so I tried 1.2.0, but the problem still occurs. Does anyone know a Hadoop version in which this issue has been resolved?
Thank you all in advance
Upvotes: 7
Views: 13414
Reputation: 23876
Downloading hadoop-core-0.20.2.jar and putting it in Nutch's lib directory resolved the problem for me.
(On Windows) If that still doesn't solve it for you, try using this Hadoop patch
Upvotes: 5
Reputation: 41
Set the following VM argument to override the default /tmp directory:
-Dhadoop.tmp.dir=<a directory location with write permission>
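If passing a JVM flag is awkward in your setup, the same property can usually be set on the job Configuration in the driver before submission. This is my own suggestion rather than part of the answer above; a minimal Hadoop 1.x-style sketch, where the class name, job name, and directory path are placeholders:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;

public class TmpDirOverrideExample {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Point hadoop.tmp.dir at a directory the current user can write to.
        // The path below is just a placeholder.
        conf.set("hadoop.tmp.dir", "C:/hadoop-tmp");
        Job job = new Job(conf, "example-job");  // Hadoop 1.x constructor
        // ... set mapper, reducer, and input/output paths as usual,
        // then submit with job.waitForCompletion(true).
    }
}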
Also using hadoop-core-0.20.2.jar (http://mvnrepository.com/artifact/org.apache.hadoop/hadoop-core/0.20.2) will solve the reported issue.
Upvotes: 4
Reputation: 746
I was getting the same exception while running Nutch 1.7 on Windows 7.
bin/nutch crawl urls -dir crawl11 -depth 1 -topN 5
The following steps worked for me:
Modify ${NUTCH_HOME}/conf/nutch-site.xml to enable the overridden implementation, as shown below:
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!-- Put site-specific property overrides in this file. -->
<configuration>
  <property>
    <name>fs.file.impl</name>
    <value>com.conga.services.hadoop.patch.HADOOP_7682.WinLocalFileSystem</value>
    <description>Enables patch for issue HADOOP-7682 on Windows</description>
  </property>
</configuration>
Run your job as usual (using Cygwin).
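For context: the class referenced in fs.file.impl wraps the local file system so that the permission call that fails on Windows becomes non-fatal instead of killing the job. I haven't seen the com.conga source, so this is only a rough sketch of the idea; the class name and logging here are my own assumptions:

import java.io.IOException;
import org.apache.hadoop.fs.LocalFileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.permission.FsPermission;

public class WindowsTolerantLocalFileSystem extends LocalFileSystem {
    @Override
    public void setPermission(Path p, FsPermission permission) throws IOException {
        try {
            super.setPermission(p, permission);
        } catch (IOException e) {
            // chmod emulation often fails on Windows; warn and continue
            // instead of aborting with "Failed to set permissions of path".
            System.err.println("Ignoring setPermission failure on " + p
                    + ": " + e.getMessage());
        }
    }
}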
Upvotes: 11
Reputation: 177
I managed to solve this by modifying the hadoop-core jar slightly. I changed the method in FileUtil.java that causes the error, recompiled it, and included the rebuilt jar in my Eclipse project. Now the error is gone. I suggest doing the same.
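For anyone attempting the same route: the method usually cited as the culprit is checkReturnValue in org.apache.hadoop.fs.FileUtil, which throws the IOException when the chmod emulation on Windows reports failure. I'm assuming this is the change the answer refers to; the commonly circulated edit simply makes that check non-fatal before recompiling, roughly like this:

// Inside org.apache.hadoop.fs.FileUtil (Hadoop 1.x source): make a failed
// permission change a warning instead of a fatal exception.
private static void checkReturnValue(boolean rv, File p,
                                     FsPermission permission
                                     ) throws IOException {
  if (!rv) {
    // Original behaviour:
    // throw new IOException("Failed to set permissions of path: " + p +
    //                       " to " +
    //                       String.format("%04o", permission.toShort()));
    // On Windows the call routinely fails, so just note it and continue.
    System.err.println("Ignoring failed permission change on " + p);
  }
}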
Upvotes: -4