Reputation: 5050
I'm using XmlReader to simply read a feed, like below.
URLConnection urlConnection = url.openConnection();
XmlReader reader = new XmlReader(urlConnection);
When this is called I receive an IOException "Timeout while fetching" within 5 seconds. So I tried setting the timeouts to the maximum (10 sec), but still no luck and still an IOException after 5 seconds.
urlConnection.setConnectTimeout(10000);
(the maximum is stated in the documentation: http://code.google.com/intl/nl-NL/appengine/docs/java/urlfetch/overview.html)
It seems that the feed is too large. When I call a smaller feed it works properly. Is there any workaround or solution for this? I need to be able to call larger feeds.
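For context, here is a minimal sketch of the fetch described above, with the connect timeout applied before opening the reader (assuming ROME's XmlReader and SyndFeedInput, and a hypothetical feed URL):
import java.net.URL;
import java.net.URLConnection;
import com.sun.syndication.feed.synd.SyndFeed;
import com.sun.syndication.io.SyndFeedInput;
import com.sun.syndication.io.XmlReader;

public class FeedFetch {
    public static void main(String[] args) throws Exception {
        // Hypothetical feed URL
        URL url = new URL("http://example.com/large-feed.xml");
        URLConnection urlConnection = url.openConnection();
        urlConnection.setConnectTimeout(10000); // only covers establishing the connection
        // Reading the response starts here; the 5-second failure happens while the body is read
        XmlReader reader = new XmlReader(urlConnection);
        SyndFeed feed = new SyndFeedInput().build(reader);
        System.out.println(feed.getTitle());
    }
}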
Upvotes: 8
Views: 4321
Reputation: 2110
The reason is:
If no data is available within the read timeout period, an exception can be thrown. From the Oracle documentation:
A SocketTimeoutException can be thrown when reading from the returned input stream if the read timeout expires before data is available for read.
By the way, ReadTimeout is different from ConnectTimeout: the connect timeout covers establishing the connection, while the read timeout covers waiting for data from the host (see the difference between connection timeout and read timeout).
So, as in @systempuntoout's answer, you need to set the read timeout.
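For illustration, a minimal sketch setting both timeouts on the same connection (hypothetical URL; values are in milliseconds):
import java.io.InputStream;
import java.net.SocketTimeoutException;
import java.net.URL;
import java.net.URLConnection;

public class TimeoutExample {
    public static void main(String[] args) throws Exception {
        URL url = new URL("http://example.com/feed.xml"); // hypothetical URL
        URLConnection urlConnection = url.openConnection();
        urlConnection.setConnectTimeout(10000); // time allowed to establish the connection
        urlConnection.setReadTimeout(10000);    // time allowed to wait for data once connected
        try (InputStream in = urlConnection.getInputStream()) {
            // read the feed from the stream here
        } catch (SocketTimeoutException e) {
            // thrown if no data arrives within the read timeout
        }
    }
}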
Upvotes: 0
Reputation: 74094
You should use the setReadTimeout method, which sets the read deadline:
urlConnection.setReadTimeout(10000); //10 Sec
You should be able to download larger feeds in 10 seconds.
If you still have problems, try fiddling with this different approach.
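If the read timeout alone is not enough, one alternative (not necessarily the approach behind that link) is App Engine's low-level URLFetchService, which takes an explicit deadline in seconds; a sketch, assuming the com.google.appengine.api.urlfetch API and a hypothetical feed URL:
import java.io.ByteArrayInputStream;
import java.net.URL;
import com.google.appengine.api.urlfetch.FetchOptions;
import com.google.appengine.api.urlfetch.HTTPMethod;
import com.google.appengine.api.urlfetch.HTTPRequest;
import com.google.appengine.api.urlfetch.HTTPResponse;
import com.google.appengine.api.urlfetch.URLFetchService;
import com.google.appengine.api.urlfetch.URLFetchServiceFactory;
import com.sun.syndication.feed.synd.SyndFeed;
import com.sun.syndication.io.SyndFeedInput;
import com.sun.syndication.io.XmlReader;

public class UrlFetchExample {
    public static SyndFeed fetchFeed(String feedUrl) throws Exception {
        URLFetchService fetcher = URLFetchServiceFactory.getURLFetchService();
        // 10 seconds is the documented maximum deadline mentioned in the question
        HTTPRequest request = new HTTPRequest(new URL(feedUrl), HTTPMethod.GET,
                FetchOptions.Builder.withDeadline(10.0));
        HTTPResponse response = fetcher.fetch(request);
        // Parse the fetched bytes with ROME
        return new SyndFeedInput().build(new XmlReader(new ByteArrayInputStream(response.getContent())));
    }
}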
Upvotes: 11