Reputation: 478
I have written code to filter, group, and sort my large data files. I have multiple text files to analyze. I know I can copy the code and run it with new data, but I was wondering whether there is a way to put this in a for loop that opens the text files one by one, runs the analysis, and stores the results. I use the following to load all my text files. In the next steps, I select columns and filter them to find the desired values. At the moment, though, I only end up with the result from one file; I want to obtain results from all data files.
Samples <- Sys.glob("*.csv")
for (filename in Samples) {
  try <- read.csv(filename, sep = ",", header = FALSE)
  # name the columns so they can be referenced below
  shear <- data.frame(Load = try[, 5], Girder = try[, 8], Shear = try[, 12])
  lane <- shear[which(shear$Load == "LL-1"), ]
  Ext <- subset(lane, Girder %in% c("Left Ext", "Right Ext"))
  Max.Ext <- max(Ext$Shear)  # overwritten on every pass, so only the last file's value survives
}
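In other words, I am hoping for something along these lines (just a sketch; a named vector is one option for the storage, the exact format does not matter to me):
Samples <- Sys.glob("*.csv")
results <- numeric(0)  # one maximum per file, named by file name
for (filename in Samples) {
  try <- read.csv(filename, sep = ",", header = FALSE)
  shear <- data.frame(Load = try[, 5], Girder = try[, 8], Shear = try[, 12])
  lane <- shear[which(shear$Load == "LL-1"), ]
  Ext <- subset(lane, Girder %in% c("Left Ext", "Right Ext"))
  results[filename] <- max(Ext$Shear)  # store the value instead of overwriting it
}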
Upvotes: 0
Views: 621
Reputation: 388807
You can put everything that you want to apply to each file in a function:
apply_fun <- function(filename) {
  try <- read.csv(filename, sep = ",", header = FALSE)
  # name the columns so the filters below can reference them
  shear <- data.frame(Load = try[, 5], Girder = try[, 8], Shear = try[, 12])
  lane <- shear[which(shear$Load == "LL-1"), ]
  Ext <- subset(lane, Girder %in% c("Left Ext", "Right Ext"))
  return(max(Ext$Shear, na.rm = TRUE))
}
Since it seems we want only one number (the max) from each file, we can use sapply to apply the function to each file:
Samples <- Sys.glob("*.csv")
sapply(Samples, apply_fun)
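sapply returns a named numeric vector, with one maximum per file and the file names as names. If a table is more convenient, one way (a sketch) is to wrap that vector in a data frame:
maxima <- sapply(Samples, apply_fun)  # named vector: names are the file names
# reshape into one row per file
data.frame(file = names(maxima), Max.Ext = unname(maxima))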
Upvotes: 1