Reputation: 974
My ISP forces a disconnect on my connection once a day. If that happens during a download, the download just pauses forever, without raising any exception.
Right now the only idea I have is to run the download in a thread and either enforce a maximum time or read the file size every now and then to check whether it's still growing.
The functions I'm using right now:
import os
import sys
from urllib import urlretrieve  # Python 2; on Python 3 it lives in urllib.request

def download_with_progress(url, localFileName, overwrite=False):
    if not os.path.exists(localFileName) or overwrite:
        urlretrieve(url, localFileName, reporthook=print_progress)
        sys.stdout.write("\r")  # remove previously printed percent sign
        sys.stdout.flush()
def print_progress(count, blockSize, totalSize):
    total_MB = totalSize / (1000 * 1000)
    current_MB = (blockSize * count) / (1000 * 1000)
    percent = int(count * blockSize * 100 / totalSize)
    sys.stdout.write("\r%d%% (%d/%d MB)" % (percent, current_MB, total_MB))
    sys.stdout.flush()
Since I'm already here: are total_MB and current_MB correctly calculated that way, or should I divide by 1024 * 1024? I'm displaying them as rounded ints, so it's not really a problem.
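For what it's worth, the "check if the file size is still growing" idea can be isolated into a small helper that the reporthook feeds. This is only a sketch; the StallDetector name and the injectable clock parameter are my own, not part of the code above:

```python
import time

class StallDetector:
    """Tracks download progress; reports a stall when the byte count
    stops growing for longer than `timeout` seconds."""

    def __init__(self, timeout=60.0, clock=time.monotonic):
        self.timeout = timeout
        self.clock = clock            # injectable so it can be tested without sleeping
        self.last_bytes = -1
        self.last_change = clock()

    def update(self, bytes_so_far):
        # Call this from the reporthook with count * blockSize.
        if bytes_so_far != self.last_bytes:
            self.last_bytes = bytes_so_far
            self.last_change = self.clock()

    def stalled(self):
        return (self.clock() - self.last_change) > self.timeout
```

A watchdog thread could then poll stalled() and abort the download (e.g. by closing the socket) once it returns True.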
Upvotes: 0
Views: 129
Reputation: 1124558
Instead of using urllib2, use the requests library; it sets the TCP Keep-Alive option, letting you detect ISP disconnects.

You may need to set additional socket options; the urllib3 library uses select to detect whether sockets are still available, and sets timeouts, but adding an explicit SO_KEEPALIVE option to the socket should make detection a little smoother still:
import httplib  # Python 2; http.client on Python 3
import socket

orig_connect = httplib.HTTPConnection.connect

def new_connect(self):
    # Run the original connect, then enable TCP keep-alive on the new socket.
    orig_connect(self)
    self.sock.setsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1)

# Monkey-patch so every HTTPConnection picks up the option.
httplib.HTTPConnection.connect = new_connect
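With requests, a read timeout alone already prevents the silent hang: requests raises requests.exceptions.Timeout if no data arrives for that many seconds, instead of blocking forever. A minimal sketch, assuming the requests package is installed (the download_with_timeout name, the (connect, read) timeout values, and the chunk size are my choices, not from the answer above):

```python
import requests

def download_with_timeout(url, local_filename, timeout=(10, 60)):
    # stream=True avoids loading the whole body into memory;
    # timeout=(connect_secs, read_secs): the read timeout fires when
    # no data has arrived for 60 seconds, raising requests.exceptions.Timeout.
    with requests.get(url, stream=True, timeout=timeout) as response:
        response.raise_for_status()
        with open(local_filename, "wb") as f:
            for chunk in response.iter_content(chunk_size=64 * 1024):
                f.write(chunk)
```

The caller can catch requests.exceptions.Timeout (or the broader requests.exceptions.RequestException) and retry or resume the download.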
Upvotes: 2