ramdaz

Reputation: 1791

What is the fastest way to check whether a folder size is greater than a specific size?

What is the fastest way to check whether a folder's size exceeds a specific size, say 10 MB, 1 GB, 10 GB, etc., without actually calculating the full folder size? Something like a quota. A Pythonic solution would be great, but standard UNIX utilities are also welcome.

Upvotes: 2

Views: 2277

Answers (4)

jfs

Reputation: 414615

import os
from os.path import join, getsize

def getsize_limited(directory, limit):
    total_size = 0
    for root, dirs, files in os.walk(directory, topdown=False):
        for name in files:
            total_size += getsize(join(root, name))
            if total_size > limit:
                # Stop walking as soon as the limit is exceeded
                return limit, False
    return total_size, True

Example:

size, within_limit = getsize_limited(os.getcwd(), limit=10**6)

Upvotes: 4

ghostdog74

Reputation: 342659

You can use du -sb, which still has to calculate the folder size, e.g.:

threshold=1024000  # bytes
path="/your/path"
s=$(du -sb "$path")
set -- $s
size=$1
if [ "$size" -gt "$threshold" ]; then
    echo "size of $path greater than $threshold"
fi

Upvotes: 2

SpliFF

Reputation: 39004

I'd have to say it's impossible. I don't believe any filesystems cache folder sizes. Whatever you do is going to have to walk the tree in some fashion or another. Using du is probably the fastest method since it's all going to be happening in C.

If you know the maximum file size expected or supported, you could perhaps optimise a little by counting the entries in each folder rather than summing their sizes, and short-circuiting when there aren't enough files to reach the limit.

Upvotes: 2

YOU

Reputation: 123881

A folder's size is still the total size of its contents.

You may try calling du -s foldername from Python.

Upvotes: 1
