Reputation: 489
I'm coding in a Databricks notebook (PySpark) and trying to get the file count and file sizes of all subfolders under a specific Azure Data Lake Gen2 mount path using dbutils.
I have code for it on a single folder, but I'm stuck on how to write the recursive part...
Upvotes: 1
Views: 13439
Reputation: 11
Get the list of files from the directory, then print them and get the count with the code below.
def get_dir_content(ls_path):
    # List the current directory, then recurse into each subdirectory.
    dir_paths = dbutils.fs.ls(ls_path)
    subdir_paths = [get_dir_content(p.path) for p in dir_paths if p.isDir() and p.path != ls_path]
    # Flatten the nested lists returned by the recursive calls.
    flat_subdir_paths = [p for subdir in subdir_paths for p in subdir]
    return list(map(lambda p: p.path, dir_paths)) + flat_subdir_paths
paths = get_dir_content('dbfs:/')
or
paths = get_dir_content('abfss://')
The line below prints each file name with its path; the length of the resulting list is the file count.
len([print(p) for p in paths])
If you only want the number of files, use:
len(paths)
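If you also need the per-subfolder file counts and sizes the question asks about, a minimal sketch along the same lines could aggregate them recursively. It relies only on the standard FileInfo attributes (path, size, isDir()); the mount path /mnt/datalake is a hypothetical placeholder.
def folder_stats(path):
    """Return (file_count, total_size_bytes) for a folder, including subfolders."""
    count, size = 0, 0
    for f in dbutils.fs.ls(path):
        if f.isDir() and f.path != path:
            sub_count, sub_size = folder_stats(f.path)
            count += sub_count
            size += sub_size
        else:
            count += 1
            size += f.size
    return count, size

# Hypothetical mount path; replace with your own.
for folder in dbutils.fs.ls('/mnt/datalake'):
    if folder.isDir():
        n, total = folder_stats(folder.path)
        print(folder.path, n, total)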
Upvotes: 1
Reputation: 6082
How about this?
def deep_ls(path: str):
    """List all files in base path recursively."""
    for x in dbutils.fs.ls(path):
        if x.path[-1] != '/':
            yield x
        else:
            for y in deep_ls(x.path):
                yield y
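A quick usage sketch on top of that generator (the mount path here is a hypothetical placeholder) gives the file count and total size the question asks for:
files = list(deep_ls('/mnt/datalake'))
print(len(files), sum(f.size for f in files))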
Credits to
https://gist.github.com/Menziess/bfcbea6a309e0990e8c296ce23125059
Upvotes: 2