Steven

Reputation: 15318

Output Pig fs command to a file or a variable

I am trying to monitor the files in my HDFS system. Currently, the way I am doing it is quite painful: fs -ls -R /river/entity/files;

This command outputs the result to the console log, and then I need to copy/paste the result into a file to use it, which is not efficient at all. Using Pig, is there a simple way to output the result of the command directly to a file?

EDIT: Thanks for the answers, but I wasn't clear enough. Sorry! I cannot use the terminal; I only execute a Pig script, and I want the result in HDFS. Is it possible to do that on the Hadoop side only?

Upvotes: 2

Views: 1286

Answers (2)

54l3d

Reputation: 3973

Yes, you can! In fact, you can execute any shell command in your Pig script like this:

%declare dummy `hdfs dfs -ls -R /river/entity/files | hdfs dfs -put - hdfs://nn.example.com/hadoop/myfile.log`

But you should keep in mind that Pig statements are not necessarily executed in the order in which they appear in your script!
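
To make that concrete, a complete script could look like the sketch below, reusing the command above. Two hedged notes: the backtick command in %declare runs during Pig's preprocessing phase, before any of the script's statements execute, and hdfs dfs -put will fail if the destination file already exists (you may need its -f flag to overwrite on reruns):

-- listing.pig: a minimal sketch; the namenode and log path come from the snippet above
-- The command substitution runs at preprocessing time, before the statements below.
%declare dummy `hdfs dfs -ls -R /river/entity/files | hdfs dfs -put - hdfs://nn.example.com/hadoop/myfile.log`

-- ... the rest of your Pig script ...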

Upvotes: 1

Rijul

Reputation: 1445

Simply type in your terminal:

$ hadoop fs -ls -R /river > your/path/to/file.txt

Or put the command in a shell script (for example, a file.sh containing the code below) and then run the script:

hadoop fs -ls -R /river > your/path/to/file.txt

If you are using the Pig Grunt shell or a Pig script, use the shell utility commands.

For example, a file_name.pig containing the line fs -ls -R /river/entity/files;

Then run your Pig script from the terminal and redirect its output: pig file_name.pig > your/path/to/file2.txt
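
As a shorthand for a one-off listing, a sketch assuming the pig launcher is on your PATH and supports the -e (execute) option for running a single command without a script file:

pig -e 'fs -ls -R /river/entity/files' > your/path/to/file2.txt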

Upvotes: 0
