Dave

Reputation: 19150

In RoR, how do I catch an exception if I get no response from a server?

I’m using Rails 4.2.3 and Nokogiri to get data from a web site. I want to perform an action when I don’t get any response from the server, so I have:

require 'open-uri'
require 'json'
require 'nokogiri'

attempts = 0
max_attempts = 3

begin
  content = open(url).read
  if content.lstrip[0] == '<'
    doc = Nokogiri::HTML(content)
  else
    begin
      json = JSON.parse(content)
    rescue JSON::ParserError => e
      content
    end
  end
rescue Net::OpenTimeout => e
  attempts += 1
  if attempts <= max_attempts
    sleep(3)
    retry
  end
end

Note that this is different from getting a 500 from the server. I only want to retry when I get no response at all, either because I get no TCP connection or because the server fails to respond (or for some other reason that causes me not to get any response). Is there a more generic way to account for this situation than what I have? I feel like there are a lot of other exception types I'm not thinking of.

Upvotes: 3

Views: 2901

Answers (3)

Eugen Minciu

Reputation: 131

When it comes to rescuing exceptions, you should aim to have a clear understanding of:

  • Which lines in your system can raise exceptions
  • What is going on under the hood when those lines of code run
  • What specific exceptions could be raised by the underlying code

In your code, the line that's fetching the content is also the one that could see network errors:

content = open(url).read

If you go to the documentation for the OpenURI module you'll see that it uses Net::HTTP & friends to get the content of arbitrary URIs.

Figuring out what Net::HTTP can raise is actually very complicated, but thankfully others have already done this work for you. Thoughtbot's suspenders project has a list of common network errors that you can use. Notice that some of those errors cover network conditions different from the one you had in mind, like the connection being reset. I think it's worth rescuing those as well, but feel free to trim the list down to your specific needs.

So here's what your code should look like (skipping the Nokogiri and JSON parts to simplify things a bit):

require 'net/http'
require 'open-uri'

HTTP_ERRORS = [
  EOFError,
  Errno::ECONNRESET,
  Errno::EINVAL,
  Net::HTTPBadResponse,
  Net::HTTPHeaderSyntaxError,
  Net::ProtocolError,
  Timeout::Error,
]
MAX_RETRIES = 3

attempts = 0

begin
  content = open(url).read
rescue *HTTP_ERRORS => e
  if attempts < MAX_RETRIES
    attempts += 1
    sleep(2)
    retry
  else
    raise e
  end
end
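As a sketch of how this pattern can be generalized, the retry logic can be factored out into a helper that takes a block (the method name `with_retries` and the simulated flaky fetch below are my own, not part of the answer):

```ruby
require 'net/http'

HTTP_ERRORS = [
  EOFError,
  Errno::ECONNRESET,
  Errno::EINVAL,
  Net::HTTPBadResponse,
  Net::HTTPHeaderSyntaxError,
  Net::ProtocolError,
  Timeout::Error,
].freeze

# Runs the block, retrying up to max_retries times on the listed
# network errors, sleeping `delay` seconds between attempts.
def with_retries(max_retries: 3, delay: 0)
  attempts = 0
  begin
    yield
  rescue *HTTP_ERRORS
    attempts += 1
    if attempts <= max_retries
      sleep(delay)
      retry
    end
    raise
  end
end

# Simulate a flaky server: the first two attempts time out, the third succeeds.
# (Net::OpenTimeout is a subclass of Timeout::Error, so the splat rescue catches it.)
calls = 0
result = with_retries(max_retries: 3) do
  calls += 1
  raise Net::OpenTimeout, "no response" if calls < 3
  "<html>ok</html>"
end
```

This keeps the rescue list in one place, so any code that fetches URLs can share the same retry policy.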

Upvotes: 5

Zoran Majstorovic

Reputation: 1609

This is a generic example of how you can define timeout durations for the HTTP connection, and perform several retries in case of an error while fetching the content:

require 'open-uri'
require 'nokogiri'

url = "http://localhost:3000/r503"

openuri_params = {
  # set timeout durations for the HTTP connection
  # (the default value for both open_timeout and read_timeout is 60 seconds)
  :open_timeout => 1,
  :read_timeout => 1,
}

attempt_count = 0
max_attempts  = 3
begin
  attempt_count += 1
  puts "attempt ##{attempt_count}"
  content = open(url, openuri_params).read
rescue OpenURI::HTTPError => e
  # it's 404, etc. (do nothing)
rescue SocketError, Net::ReadTimeout => e
  # server can't be reached or doesn't send any response
  puts "error: #{e}"
  sleep 3
  retry if attempt_count < max_attempts
else
  # connection was successful,
  # content is fetched,
  # so here we can parse content with Nokogiri,
  # or call a helper method, etc.
  doc = Nokogiri::HTML(content)
  p doc
end

Upvotes: 5

spickermann

Reputation: 106932

I would think about using a Timeout that raises an exception after a short period:

require 'timeout'
require 'open-uri'

MAX_RESPONSE_TIME = 2 # seconds

attempts = 0
max_attempts = 3

begin
  content = nil # needs to be defined before the following block
  Timeout.timeout(MAX_RESPONSE_TIME) do
    content = open(url).read
  end

  # parsing `content`
rescue Timeout::Error => e
  attempts += 1
  if attempts <= max_attempts
    sleep(3)
    retry
  end
end
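To see the mechanism in isolation, here is a toy version where a `sleep` stands in for a server that never responds; `Timeout.timeout` aborts the block and raises once the limit is exceeded:

```ruby
require 'timeout'

result = nil
begin
  Timeout.timeout(0.1) do
    sleep 1             # stands in for a hung network call
    result = :finished  # never reached
  end
rescue Timeout::Error
  result = :timed_out
end

result # => :timed_out
```

One caveat worth knowing: `Timeout.timeout` interrupts the block from another thread, so it can cut code off at arbitrary points; for HTTP specifically, the `open_timeout`/`read_timeout` options shown in the previous answer are generally the safer choice.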

Upvotes: 1
