Reputation: 443
I'm trying to get a CSV file from a URL, but the request seems to time out after one minute. The CSV file is generated at the time of the request, so it needs a little more than a minute. I tried to increase the timeout, but it didn't work; it still fails after a minute.
I'm using getURL and read.csv as follows:
library(RCurl)

# Start the timer
ptm <- proc.time()
urlCSV <- getURL("http://someurl.com/getcsv", timeout = 200)
txtCSV <- textConnection(urlCSV)
csvFile <- read.csv(txtCSV)
close(txtCSV)
# Stop the timer
proc.time() - ptm
resulting log:
Error in open.connection(file, "rt") : cannot open the connection
In addition: Warning message:
In open.connection(file, "rt") :
cannot open: HTTP status was '500 Internal Server Error'
   user  system elapsed
  0.225   0.353  60.445
It keeps failing when it reaches one minute. What could be the problem, or how do I increase the timeout?
I tried the URL in a browser and it works fine, but it takes more than a minute to load the CSV.
Upvotes: 6
Views: 7283
Reputation: 103898
You're getting a 500 error from the server, which suggests the timeout is happening there and is therefore outside your control (unless you can ask for less data).
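If you want to confirm that the failure really comes from the server side, one option is to capture the response headers and inspect the HTTP status. A minimal sketch using RCurl's basicHeaderGatherer, reusing the placeholder URL from the question:

library(RCurl)

# Collect the response headers alongside the body so the HTTP status can be
# inspected; "http://someurl.com/getcsv" is the question's placeholder URL.
h <- basicHeaderGatherer()
body <- getURL("http://someurl.com/getcsv", headerfunction = h$update)
h$value()[["status"]]   # e.g. "500" if the server itself gave up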
Upvotes: 2
Reputation: 30425
libcurl has a CONNECTTIMEOUT setting: http://curl.haxx.se/libcurl/c/CURLOPT_CONNECTTIMEOUT.html
You can set this in RCurl:
library(RCurl)

# Confirm that connecttimeout is among the curl option constants RCurl exposes
getCurlOptionsConstants()[["connecttimeout"]]
# [1] 78

myOpts <- curlOptions(connecttimeout = 200)
urlCSV <- getURL("http://someurl.com/getcsv", .opts = myOpts)
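For completeness, a minimal sketch of feeding the downloaded text back into read.csv, mirroring the snippet in the question (same placeholder URL):

library(RCurl)

# connecttimeout is given in seconds; the URL is the question's placeholder
myOpts <- curlOptions(connecttimeout = 200)
urlCSV <- getURL("http://someurl.com/getcsv", .opts = myOpts)

# Parse the downloaded text exactly as in the question
txtCSV <- textConnection(urlCSV)
csvFile <- read.csv(txtCSV)
close(txtCSV)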
Upvotes: 5