Abhinay

Reputation: 635

How to copy data from HDFS to the local FS using an Oozie workflow?

I have written Java code that uses FileUtil.copyMerge(…) to merge the files into a single file.
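Roughly, the merge logic looks like the following minimal sketch (a simplified reconstruction, not the exact class; it assumes Hadoop 2.x, where FileUtil.copyMerge(...) is available):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.FileUtil;
import org.apache.hadoop.fs.Path;

public class Merging {
    public static void main(String[] args) throws Exception {
        Path srcDir  = new Path(args[0]); // directory of part files (first workflow <arg>)
        Path dstFile = new Path(args[1]); // single merged output file (second workflow <arg>)

        Configuration conf = new Configuration();
        FileSystem srcFs = srcDir.getFileSystem(conf);
        FileSystem dstFs = dstFile.getFileSystem(conf);

        // Concatenate every file under srcDir into dstFile; do not delete the source.
        FileUtil.copyMerge(srcFs, srcDir, dstFs, dstFile, false, conf, null);
    }
}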

Below is my Oozie java action:

<main-class>Merging</main-class>
<arg>${nameNode}/user/abhime01/haadoop/Merge/merge_output</arg>
<arg>file:///home/abhi01/yoooize.txt</arg>

In the above workflow, if I give the second arg (the destination) as a path in HDFS, I am able to merge the data and store it in HDFS.

But if I give it as a path on the local file system (as in the snippet above), I get the following error:

Mkdirs failed to create file:/home/abhime01 (exists=false, cwd=file:/CDH/sdu1/yarn/nm/usercache/abhime01/appcache/application_1440579785423_1755/container_e27_1440579785423_1755_01_000001)

Can anyone please suggest how to merge files and store them on the local FS using Oozie?

PS: The Java code works fine when run without Oozie; the problem only appears when it is run through Oozie.

Upvotes: 1

Views: 1551

Answers (1)

suresiva

Reputation: 3173

When you execute actions with Oozie, each action is run by a container on some arbitrary node in the cluster.

So the local file path you mention is no longer valid for Oozie, since the action's scope is the cluster, not any one machine.

So you should not use local file paths as inputs to any type of Oozie action; bring all dependent inputs onto HDFS and expect the outputs on HDFS as well while you use Oozie.
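For example, keeping both the source and the destination on HDFS works from whichever node the action lands on; you can then pull the merged file down to a local machine afterwards, outside of Oozie (for instance with hdfs dfs -get). A rough sketch, with illustrative paths and a hypothetical class name:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.FileUtil;
import org.apache.hadoop.fs.Path;

public class MergeToHdfs {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);

        // Both paths stay on HDFS, so the node running the Oozie launcher does not matter.
        Path srcDir  = new Path("/user/abhime01/haadoop/Merge/merge_output"); // illustrative
        Path dstFile = new Path("/user/abhime01/haadoop/Merge/merged.txt");   // illustrative

        // Merge all part files on HDFS into one file, also on HDFS.
        FileUtil.copyMerge(fs, srcDir, fs, dstFile, false, conf, null);

        // After the workflow finishes, fetch the result to a local machine outside Oozie,
        // e.g.:  hdfs dfs -get /user/abhime01/haadoop/Merge/merged.txt <local path>
    }
}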

Upvotes: 1
