London guy

Reputation: 28022

Viewing the number of blocks for a file in hadoop

How can I see how many blocks a file has been broken into in a Hadoop file system?

Upvotes: 25

Views: 29472

Answers (4)

user1795667

Reputation: 423

It is better to use hdfs instead of hadoop here, since the 'hadoop fsck' form is deprecated.

Here is the command with hdfs: to find the details of a file named 'test.txt' in the root directory, you would write

hdfs fsck /test.txt -files -blocks -locations
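
If you only care about the number of blocks, you can filter the summary line of the fsck report (a sketch assuming the stock fsck output, which includes a 'Total blocks (validated):' line):

hdfs fsck /test.txt -files -blocks | grep "Total blocks"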

Upvotes: 3

yoga

Reputation: 1959

hadoop fsck filetopath

Using the above command in CDH 5 produced the error below:

hadoop-hdfs/bin/hdfs: line 262: exec: : not found

The command below worked instead:

hdfs fsck filetopath

Upvotes: 4

user1261215

Reputation:

You can use the hadoop file system check command (fsck) to see the blocks for a specific file.

Below is the command:

hadoop fsck [path] [options]

To view the blocks for a specific file:

hadoop fsck /path/to/file -files -blocks
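
To also see which datanodes and racks hold each block, further options can be stacked on (option names as in stock Hadoop; -locations prints the datanodes, -racks adds rack information):

hadoop fsck /path/to/file -files -blocks -locations -racks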

Upvotes: 45

user1514507

Reputation: 3

This shows the block size rather than the block count, but it can still help:

hadoop fs -stat "%o" /path/to/file
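
Since %o only gives the block size, one way to turn it into a block count is a ceiling division against the file length (%b). A minimal sketch, assuming a POSIX shell and that /path/to/file is a plain file:

size=$(hadoop fs -stat "%b" /path/to/file)
bs=$(hadoop fs -stat "%o" /path/to/file)
echo $(( (size + bs - 1) / bs ))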

Upvotes: -4
