frazman

Reputation: 33293

Copying sample of file from hdfs to local fs?

Ok,

A very stupid question...

I have a large file in hdfs

/user/input/foo.txt

I want to copy the first 100 lines from this location to the local filesystem.

The data is very sensitive, so I am a bit hesitant to experiment.

What is the right way to copy sample data from HDFS to the local fs?

Upvotes: 0

Views: 3826

Answers (3)

Jaime

Reputation: 153

Here is an easy way (note that `-copyToLocal` writes straight to the local file and produces nothing on stdout, so piping it to `head` would not work; use `-cat` instead):

hdfs dfs -cat /user/input/foo.txt | head -100 > /path/to/local/file

Upvotes: 2

Tariq

Reputation: 34184

If the file is not compressed:

bin/hadoop fs -cat /path/to/file | head -100 > /path/to/local/file

If the file is compressed:

bin/hadoop fs -text /path/to/file | head -100 > /path/to/local/file
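The `-cat | head` pattern is read-only on the HDFS side, which makes it a safe way to sample sensitive data. A minimal local sketch of the same pipeline (with plain `cat` on a generated file standing in for `hadoop fs -cat`, purely as an assumption for demonstration):

```shell
# Stand-in for the large HDFS file: 1000 numbered lines (assumption for the demo).
seq 1 1000 > /tmp/foo.txt

# Same pipeline shape as above; `cat` substitutes for `hadoop fs -cat`.
cat /tmp/foo.txt | head -100 > /tmp/foo.sample.txt

# Sanity-check the sample before handing it off.
wc -l < /tmp/foo.sample.txt    # prints 100
head -1 /tmp/foo.sample.txt    # prints 1
```

`head` exits after 100 lines, so only a small prefix of the stream ever crosses to the local side.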

Upvotes: 4

jgauld

Reputation: 649

You could use the `head` program to extract the first few lines of a file. Note that `head` cannot read an HDFS path directly, so stream the file through `hadoop fs -cat` first, e.g.:

$ hadoop fs -cat /user/input/foo.txt | head -n 100

(where `-n` determines the number of lines to extract), and redirect the output to the file of your choice:

$ hadoop fs -cat /user/input/foo.txt | head -n 100 > /path/to/your/output/file

Upvotes: 1
