Reputation: 599
I am trying to solve this issue but am unable to understand it. The Pig script on my development machine ran successfully on a 1.8 GB data file. When I try to run it on the server, it says it cannot find a local directory to spill data to (spill0.out). I have already modified the pig.temp.dir property in the pig.properties file to point to a location with free space.
error: org.apache.hadoop.util.DiskChecker$DiskErrorException: Could not find any valid local directory for output/spill0.out
So how can I find out where Pig is spilling the data, and can the spill directory location be changed somehow?
I am using Pig in local mode.
Any ideas or suggestions or workarounds will be of great help.
Thanks.
Upvotes: 1
Views: 217
Reputation: 4310
I had no luck with these answers; Pig (version 0.15.0) was still writing pigbag* files to the /tmp directory, so I just renamed my /tmp directory and created a symbolic link to the desired location like this:
sudo -s #change to root
cd /
mv tmp tmp_local
ln -s /desired/new/tmp/location tmp
chmod 1777 tmp
mv tmp_local/* tmp
Make sure there are no active applications writing to the /tmp folder at the time you run these commands.
Upvotes: 0
Reputation: 599
I found an answer.
We need to add the following properties to the $PIG_HOME/conf/pig.properties file:
mapreduce.jobtracker.staging.root.dir
mapred.local.dir
pig.temp.dir
and then test.
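For illustration, the entries might look something like this. The paths below are placeholders I have made up, not values from the original answer; point them at a partition with enough free space:

```properties
# Hypothetical example paths; adjust to a location with free space
pig.temp.dir=/data/pig/tmp
mapred.local.dir=/data/hadoop/mapred/local
mapreduce.jobtracker.staging.root.dir=/data/hadoop/staging
```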
This has helped me solve the problem.
Upvotes: 1
Reputation: 921
This is not a problem with Pig. I'm not using Pig and I get exactly the same error, so the problem seems to be related to Hadoop itself. I am also running in local mode, on Hadoop 2.6.0.
Upvotes: 0