Mohit Rane

Reputation: 279

Hadoop fs -ls command to get first 10 files

I have a Hadoop command like this:

hadoop fs -ls /user/hive/warehouse/mashery_db.db/agg_per_mapi_stats_five_minutes/ | sort | awk  '{ if (index($8, ".hive") == 0 && $6 <= "'"2016-02-10"'" && $7 <= "'"05:00"'") print $8 }'

I want to get only the first 10 values from it, instead of getting all the files in the directory.

Upvotes: 2

Views: 6930

Answers (2)

Devendra Bhat

Reputation: 1219

Just use

hadoop fs -ls /path/of/hdfs/location/ | head -10

that will work.

Upvotes: 5

OneCricketeer

Reputation: 191874

Add another pipe at the end to head -10:

hadoop fs -ls /stuff | sort | awk whatever | head -10
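The truncation behavior is easy to verify without a Hadoop cluster. In this sketch, seq stands in for the hadoop fs -ls listing; head -10 prints only the first 10 lines of the sorted output and discards the rest:

```shell
# Simulated pipeline: `seq 20` stands in for the `hadoop fs -ls` output.
# `head -10` keeps only the first 10 lines after sorting.
seq 20 | sort -n | head -10
```

With the question's command, the same | head -10 is simply appended after the awk filter. Note that head -n 10 is the POSIX-specified form; head -10 is an older shorthand that most implementations still accept.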

Upvotes: 3