Reputation: 4229
I have thousands of *.csv files in multiple folders: ../t1/*.csv, ../t2/*.csv, ../t3/*.csv, etc.
I can read the files from multiple folders as follows:
filenames <- list.files(c("C:/Example/t1","C:/Example/t2"), pattern="*.csv", full.names=TRUE)
list.df <- lapply(filenames, read.csv)
However, I have to type in all the directories (C:/Example/t1, C:/Example/t2, etc.). How can I read all the data (as a list of data.frames) by giving just one main directory, something like C:/Example/*?
Upvotes: 0
Views: 479
Reputation: 94192
Using list.files with recursive=TRUE will search all folders under the first argument for matching files:
> list.files("./",recursive=TRUE)
[1] "a/a1.csv" "a/a2.csv" "a/notme.txt" "b/b1.csv" "d/e/e1.csv"
That's all the files under my current directory. If I only want CSVs:
> list.files("./",recursive=TRUE,pattern="*.csv")
[1] "a/a1.csv" "a/a2.csv" "b/b1.csv" "d/e/e1.csv"
Notice how it also looks in the second-level d/e/ folder?
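To turn that into the list of data frames the question asks for, you can combine recursive=TRUE with full.names=TRUE and pass the result to lapply. A minimal sketch, assuming C:/Example is the main directory:
# find every CSV under C:/Example, at any depth, returning full paths
filenames <- list.files("C:/Example", pattern="\\.csv$", recursive=TRUE, full.names=TRUE)
# read each file into a data.frame and collect them in a list
list.df <- lapply(filenames, read.csv)
Note that pattern is a regular expression, so "\\.csv$" is a stricter match than "*.csv" and only keeps files whose names end in .csv.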
If you only want to go to a single, specific depth, try Sys.glob. Its patterns match folders and files, and these examples work from the current directory:
Only first level:
> Sys.glob("*/*.csv")
[1] "a/a1.csv" "a/a2.csv" "b/b1.csv"
Only second level:
> Sys.glob("*/*/*.csv")
[1] "d/e/e1.csv"
Upvotes: 4