Reputation: 79
I am having trouble downloading multiple files from an online directory. I am running a virtual Linux environment (Lubuntu) on VMware. My aim is to access a subfolder and download all the .gz files it contains into a new local directory, separate from the home directory. I tried multiple solutions, and this is the closest I have got.
import os
from urllib2 import urlopen, URLError, HTTPError

def dlfile(url):
    # Open the URL
    try:
        f = urlopen(url)
        print "downloading " + url
        # Open our local file for writing
        with open(os.path.basename(url), "wb") as local_file:
            local_file.write(f.read())
    # Handle errors
    except HTTPError, e:
        print "HTTP Error:", e.code, url
    except URLError, e:
        print "URL Error:", e.reason, url

def main():
    # Iterate over the numeric suffixes in the file names
    for index in range(100, 250, 5):
        url = ("http://data.ris.ripe.net/rrc00/2016.01/updates20160128.0%d.gz"
               % (index))
        dlfile(url)

if __name__ == '__main__':
    main()
The online directory requires no authentication; a link can be found here.
I tried string manipulation with a loop over the file names, but it gave me the following error:
HTTP Error: 404 http://data.ris.ripe.net/rrc00/2016.01/updates20160128.0245.gz
Upvotes: 0
Views: 300
Reputation: 644
Look at the URLs:
Good URL: http://data.ris.ripe.net/rrc00/2016.01/updates.20160128.0245.gz
Bad URL (your code): http://data.ris.ripe.net/rrc00/2016.01/updates20160128.0245.gz
The dot between updates and 20160128 is missing.
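For completeness, here is a minimal sketch of the corrected loop that also writes into a separate local directory, as the question asks. It stays with Python 2 to match your code; the directory name "downloads" is my assumption, substitute whatever path you want:

import os
from urllib2 import urlopen, URLError, HTTPError

TARGET_DIR = "downloads"  # assumed name for the separate local directory

def main():
    # Create the target directory if it does not exist yet
    if not os.path.isdir(TARGET_DIR):
        os.makedirs(TARGET_DIR)
    for index in range(100, 250, 5):
        # Note the dot between "updates" and the date
        url = ("http://data.ris.ripe.net/rrc00/2016.01/updates.20160128.0%d.gz"
               % (index))
        try:
            f = urlopen(url)
            # Join the directory with the file name instead of
            # writing into the current working directory
            local_path = os.path.join(TARGET_DIR, os.path.basename(url))
            with open(local_path, "wb") as local_file:
                local_file.write(f.read())
        except (HTTPError, URLError) as e:
            print "skipping", url, "-", e

if __name__ == '__main__':
    main()

Keep the error handling in any case: some indices produced by range(100, 250, 5) (e.g. 0160) may not correspond to valid HHMM timestamps in the file names, so occasional 404s are expected even with the corrected URL.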
Upvotes: 1