Reputation: 3811
I have a script that scans a set of folders to get their sizes and displays this information in the browser. The script calls 'du' and parses its output.
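For context, here is a minimal sketch of the kind of script I mean (Python is assumed since the language doesn't matter for the question; the folder paths are just placeholders):

```python
import subprocess

def folder_size_kib(path):
    """Run `du -sk` on a path and parse the size in KiB from its output."""
    # du -sk prints one line: "<size-in-KiB>\t<path>"
    result = subprocess.run(["du", "-sk", path],
                            capture_output=True, text=True, check=True)
    return int(result.stdout.split()[0])

# Example: collect sizes for a set of folders to render in the browser.
# These paths are hypothetical.
for folder in ["/var/www/uploads", "/var/log"]:
    print(folder, folder_size_kib(folder), "KiB")
```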
The question is about performance: how fast is it, for example, if the directory is 4 GB in size and contains 100,000 files?
P.S. I understand that these metrics depend on hardware, but if you have experience scanning large directories for sizes, could you share it?
Thank you.
Upvotes: 2
Views: 679
Reputation: 91932
It depends heavily on the file system. It's usually pretty slow on ext3, and on most other file systems as well if there are lots of subdirectories.
I don't think there's any other way to do it in real time, however. You can pre-scan the directory and cache the result in a file or a database, but that considerably increases the complexity.
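For example, a file-based cache of a pre-scan might look like this (a minimal sketch; Python, the cache path, and the 10-minute TTL are all my own illustrative choices, not part of any standard approach):

```python
import json
import os
import subprocess
import time

CACHE_FILE = "/tmp/du_cache.json"  # hypothetical cache location
CACHE_TTL = 600                    # refresh every 10 minutes (arbitrary choice)

def cached_sizes(folders):
    """Return {folder: size_in_KiB}, re-running du only when the cache is stale."""
    # Serve the cached result while it is fresh enough.
    if os.path.exists(CACHE_FILE) and time.time() - os.path.getmtime(CACHE_FILE) < CACHE_TTL:
        with open(CACHE_FILE) as f:
            return json.load(f)
    # Otherwise pre-scan all folders and rewrite the cache.
    sizes = {}
    for folder in folders:
        result = subprocess.run(["du", "-sk", folder],
                                capture_output=True, text=True, check=True)
        sizes[folder] = int(result.stdout.split()[0])
    with open(CACHE_FILE, "w") as f:
        json.dump(sizes, f)
    return sizes
```

The browser-facing request then only reads the cache, so its latency no longer depends on the directory size; the trade-off is that the displayed numbers can be up to one TTL out of date.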
Upvotes: 2