Reputation: 91
I am running a simulation study and need to process and save the results from several text files. The data are organized so that there are subdirectories, and within each subdirectory I need to process and get individual results for 1000 data files. This is very easy to do in SAS using macros. However, I am new to R and cannot figure out how to do the same thing there. Below is what I am trying to accomplish.
DATA Folder -> DC1 -> DC1R1.txt ... DC1R1000.txt
            -> DC2 -> DC2R1.txt ... DC2R1000.txt
Any help would be greatly appreciated!
Upvotes: 9
Views: 21185
Reputation: 828
filenames <- list.files("path/to/files", recursive=TRUE)
This will give you all the files under a folder and all of its subfolders.
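A minimal sketch of how that can feed into per-file processing. The toy DATA layout below mimics the one in the question, and the use of scan and mean is only an assumption about the file contents; adjust the read step to your actual format:

```r
# Build a small stand-in for the layout in the question:
# DATA/DC1/DC1R1.txt ... DATA/DC2/DC2R3.txt
base <- file.path(tempdir(), "DATA")
for (d in c("DC1", "DC2")) {
  dir.create(file.path(base, d), recursive = TRUE, showWarnings = FALSE)
  for (i in 1:3) {
    writeLines(as.character(rnorm(5)),
               file.path(base, d, sprintf("%sR%d.txt", d, i)))
  }
}

# full.names = TRUE returns paths you can pass straight to a read function
filenames <- list.files(base, recursive = TRUE, full.names = TRUE,
                        pattern = "\\.txt$")

# Process each file; here we just compute the mean of its values
results <- sapply(filenames, function(f) mean(scan(f, quiet = TRUE)))
length(results)  # one result per file
```

Note full.names = TRUE: without it, list.files returns paths relative to the folder you listed, which will not resolve unless that folder is also your working directory.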
Upvotes: 5
Reputation: 40803
I'm not near a computer with R right now, but read the help for file-related functions:
The dir function will list the files and directories, and it has a recursive argument. list.files is an alias for dir. The file.info function will tell you (among other things) whether a path is a directory, and file.path will combine path parts. The basename and dirname functions might also be useful.
Note that all these functions are vectorized.
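A quick illustration of that vectorization (the paths here are invented for the example):

```r
# file.path combines path parts element-wise, recycling shorter vectors
paths <- file.path("DATA", c("DC1", "DC2"), c("DC1R1.txt", "DC2R1.txt"))
paths            # "DATA/DC1/DC1R1.txt" "DATA/DC2/DC2R1.txt"

# basename/dirname split a whole vector of paths at once
basename(paths)  # "DC1R1.txt" "DC2R1.txt"
dirname(paths)   # "DATA/DC1" "DATA/DC2"
```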
EDIT Now at a computer, so here's an example:
# Make a function to process each file
processFile <- function(f) {
df <- read.csv(f)
# ...and do stuff...
file.info(f)$size # dummy result
}
# Find all .csv files
files <- dir("/foo/bar/", recursive=TRUE, full.names=TRUE, pattern="\\.csv$")
# Apply the function to all files.
result <- sapply(files, processFile)
Upvotes: 13
Reputation: 55695
If you need to run the same analysis on each of the files, you can access them all in one shot using list.files(recursive = TRUE). This assumes that you have already set your working directory to the DATA folder. The recursive = TRUE argument lists files within subdirectories as well.
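Since the question asks for individual results per subdirectory (DC1, DC2, ...), the dirname of each relative path can serve as a grouping key. A sketch under the same assumptions, using a self-contained toy folder and files holding one number per line:

```r
# Create a toy DATA folder so the example is self-contained
root <- file.path(tempdir(), "DATAFOLDER")
dir.create(root, showWarnings = FALSE)
setwd(root)
for (d in c("DC1", "DC2")) {
  dir.create(d, showWarnings = FALSE)
  for (i in 1:2)
    writeLines(as.character(1:5), file.path(d, sprintf("%sR%d.txt", d, i)))
}

# Relative paths, since the working directory is the DATA folder
files <- list.files(recursive = TRUE, pattern = "\\.txt$")
per_file <- sapply(files, function(f) sum(scan(f, quiet = TRUE)))

# Group the per-file results by their subdirectory
per_dir <- split(per_file, dirname(files))
names(per_dir)  # "DC1" "DC2"
```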
Upvotes: 8