Reputation: 3945
I'm using central_logger to store logs from our Rails app in MongoDB. When the Mongo server went down recently, our app started timing out on Mongo inserts. How can I prevent Rails from timing out if the Mongo server goes down?
Upvotes: 1
Views: 564
Reputation: 40277
The Ruby driver supports a connection timeout like so:
@conn = Connection.new("localhost", 27017, :pool_size => 5, :timeout => 5)
But the central_logger gem isn't using that. So you can either fork it to add that in there, or monkey-patch the CentralLogger::MongoLogger#connect method:
def connect
  @mongo_connection ||= Mongo::Connection.new(@db_configuration['host'],
                                              @db_configuration['port'],
                                              :auto_reconnect => true).db(@db_configuration['database'])
  if @db_configuration['username'] && @db_configuration['password']
    # the driver stores credentials in case reconnection is required
    @authenticated = @mongo_connection.authenticate(@db_configuration['username'],
                                                    @db_configuration['password'])
  end
end
You would need to monkey-patch in :timeout => 5 (or whatever) to the Mongo::Connection.new call.
I would bet the author of central_logger would like to have this in there, so a fork and pull request would likely be welcome.
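A minimal sketch of that monkey-patch (the Mongo and CentralLogger classes below are stand-in stubs so the example runs without the mongo gem or a live mongod, and the authentication branch is omitted for brevity — in a real Rails app you'd reopen the actual CentralLogger::MongoLogger, e.g. from an initializer):

```ruby
# Stand-in stubs: just record the options that would be passed to the driver.
module Mongo
  class Connection
    attr_reader :opts
    def initialize(host, port, opts = {})
      @opts = opts
    end
    def db(_name)
      self # the real driver returns a Mongo::DB here
    end
  end
end

module CentralLogger
  class MongoLogger
    def initialize(db_configuration)
      @db_configuration = db_configuration
    end

    # Patched connect: identical to the gem's version except that
    # :timeout is merged into the connection options.
    def connect
      @mongo_connection ||= Mongo::Connection.new(@db_configuration['host'],
                                                  @db_configuration['port'],
                                                  :auto_reconnect => true,
                                                  :timeout => 5).db(@db_configuration['database'])
    end
  end
end

logger = CentralLogger::MongoLogger.new('host' => 'localhost',
                                        'port' => 27017,
                                        'database' => 'logs')
logger.connect.opts[:timeout]  # => 5
```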
Upvotes: 1
Reputation: 25767
Usually the database insert should be fast, so you could wrap it with Ruby's Timeout module:
require 'timeout'
Timeout::timeout(0.2) do
  # ... write to log server
end
This raises Timeout::Error if the block takes longer than 200 milliseconds, so rescue that error if the app should carry on regardless.
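A self-contained sketch of that pattern — the write_log_with_timeout helper name is hypothetical, and the block stands in for the actual log-server write:

```ruby
require 'timeout'

# Hypothetical wrapper: attempt the log write, but give up after the
# given number of seconds and rescue Timeout::Error so the request
# can continue instead of stalling.
def write_log_with_timeout(seconds = 0.2)
  Timeout.timeout(seconds) { yield }
  true
rescue Timeout::Error
  # The write took too long -- drop this log entry rather than block.
  false
end

write_log_with_timeout { :fast_write }       # => true
write_log_with_timeout(0.05) { sleep 1 }     # => false
```

Note that the slow write is abandoned, not retried, so a log entry can be silently lost when the Mongo server is down.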
Upvotes: 0
Reputation: 147324
You could use replica sets — if the master goes down, the driver can fail over automatically to one of the replicas.
Upvotes: 0