onyourmark

Reputation: 75

Implementing tryCatch in R

I am trying to use tryCatch. What I want is to run through a list of URLs that I have stored in page1URLs and, if there is a problem with one of them (using readHTMLTable()), keep a record of which ones failed and then have the code move on to the next URL without crashing.

I don't think I have the right idea here at all. Can anyone suggest how I can do this?

Here is the beginning of the code:

baddy <- rep(NA, 10000)
badURLs <- function(url) { baddy=c(baddy,url) }

writeURLsToCsvExtrema(38.361042, 35.465144, 141.410522, 139.564819)

writeURLsToCsvExtrema <- function(maxlat, minlat, maxlong, minlong) {

urlsFuku <- page1URLs
allFuku <- data.frame() # need to initialize it with column names

for (url in urlsFuku) {

    tryCatch(temp.tables=readHTMLTable(url), finally=badURLs(url))

    temp.df <- temp.tables[[3]]
    lastrow <- nrow(temp.df)
    temp.df <- temp.df[-c(lastrow-1,lastrow),] 

}

Upvotes: 0

Views: 150

Answers (1)

Martin Morgan

Reputation: 46876

One general approach is to write a function that fully processes one URL, returning either the computed value or NULL to indicate failure (readHTMLTable() here comes from the XML package):

FUN = function(url) {
    tryCatch({
        xx <- readHTMLTable(url)  ## will sometimes fail, invoking 'error' below
        ## more calculations
        xx  ## final value
    }, error=function(err) {
        ## what to do on error? could return conditionMessage(err) or other...
        NULL
    })
}
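
As the comment in the error handler suggests, the handler could return conditionMessage(err) instead of NULL, so the record also shows why each URL failed. A possible variant, just a sketch that keeps the structure of FUN above:

FUN2 = function(url) {
    tryCatch({
        readHTMLTable(url)      ## or whatever further calculations are needed
    }, error=function(err) {
        conditionMessage(err)   ## return the error message rather than NULL
    })
}

With that variant the failing elements hold character messages rather than NULL, so they would be picked out with sapply(result, is.character) instead of is.null(); the examples below use the NULL-returning FUN.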

and then use this, e.g., with a named vector

urls <- c("http://cran.r-project.org", "http://stackoverflow.com", 
          "http://foo.bar")
names(urls) <- urls           # add names to urls, so 'result' elements are named
result <- lapply(urls, FUN)

These guys failed (returned NULL)

> names(result)[sapply(result, is.null)]
[1] "http://foo.bar"

And these are the results for further processing

final <- Filter(Negate(is.null), result)
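
From here, the per-page steps in the question (taking the third table and dropping its last two rows) could be applied to each successful result and the pieces combined. This is a sketch only, assuming the XML package is loaded and that the extracted tables have matching columns so they rbind cleanly:

allFuku <- do.call(rbind, lapply(final, function(tables) {
    temp.df <- tables[[3]]                  ## third table on the page, as in the question
    lastrow <- nrow(temp.df)
    temp.df[-c(lastrow - 1, lastrow), ]     ## drop the last two rows
}))

baddy <- names(result)[sapply(result, is.null)]   ## record of the URLs that failed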

Upvotes: 2
