saad

Reputation: 101

Downloading a list of URLs/files using a loop - Python

I need to download approximately 1000 files/URLs, and it would be hard to do that manually.

I tried to put the URLs in a list and loop over it, but I think my code overwrites the previous files and keeps only the last item in the list.

Here is my code:

#!/usr/bin/env python

import urllib3
http = urllib3.PoolManager()

urls = ["http://url1.nt.gz" , "http://url2.nt.gz" , "http://url3.nt.gz"]
N =1; // counter helps me to rename the downloaded files
print "downloading with urllib"
for url in urls
 r = http.request('GET',url)
 Name =str(N+1) // each time increment the counter by one 
 with open("file"+Name+".nt.gz", "wb") as fcont:
                fcont.write(r.data)

Any suggestions?

Upvotes: 2

Views: 6053

Answers (2)

Patrick Artner

Reputation: 51683

You do not increment the counter: you compute N + 1 for the file name without ever saving the result back to N.

Add N += 1 after setting Name. You are also missing a : after your for.

I am not quite sure where you keep your thousands of URLs; I only see 3 in urls. With those fixes the code looks like this:

#!/usr/bin/env python

import urllib3
http = urllib3.PoolManager()

urls = ["http://url1.nt.gz", "http://url2.nt.gz", "http://url3.nt.gz"]
N = 1  # counter used to name the downloaded files
print("downloading with urllib3")
for url in urls:
    r = http.request('GET', url)
    Name = str(N)
    N += 1  # increment the counter so every file gets a unique name
    with open("file" + Name + ".nt.gz", "wb") as fcont:
        fcont.write(r.data)
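
As a side note, since the question mentions roughly 1000 URLs, they presumably live in a file rather than a hard-coded list. Below is a minimal sketch of the same download loop that reads them from a plain-text file and uses enumerate instead of a hand-maintained counter; the file name urls.txt and the one-URL-per-line layout are assumptions, not something given in the question.

#!/usr/bin/env python

import urllib3

http = urllib3.PoolManager()

# Assumed input: a plain-text file "urls.txt" with one URL per line.
with open("urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

# enumerate() yields a running index, so no manual counter is needed.
for n, url in enumerate(urls, start=1):
    r = http.request('GET', url)
    with open("file" + str(n) + ".nt.gz", "wb") as fcont:
        fcont.write(r.data)

Using enumerate(urls, start=1) removes the manual N bookkeeping, which is exactly the part that went wrong in the original code.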

Upvotes: 2

Harry_Hirsch

Reputation: 28

print "downloading with urllib" for url in urls r = http.request('GET',url) Name += N

Upvotes: 0
