Reputation: 89

urllib freezes if the file is too big!

OK, I'm trying to open a URL using urllib, but the problem is that the file is too big, so Python freezes when I open it. I'm also using wxPython, which freezes as well, and my CPU goes to almost 100% while the URL is being read.

Any solutions? Is there a way I can read the URL in chunks, maybe with a time.sleep(0.5) in between, so it doesn't freeze? This is my code:

import urllib

f = open("hello.txt", 'wb')
datatowrite = urllib.urlopen(link).read()  # blocks until the entire file is downloaded
f.write(datatowrite)
f.close()

Thanks

Upvotes: 1

Views: 375

Answers (1)

Paul McMillan
Paul McMillan

Reputation: 20117

You want to move the download into a separate thread, so your UI thread keeps responding while the download thread does the work. That way you don't get the "freeze" while the download happens.

Read more about threading here:

http://docs.python.org/library/threading.html
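A minimal sketch of that idea, assuming Python 3's urllib.request (the question's urllib.urlopen is Python 2); the URL and filename in the commented usage are placeholders:

```python
import io
import threading
import urllib.request  # Python 3 equivalent of the question's urllib.urlopen


def copy_in_chunks(src, dst, chunk_size=8192):
    """Copy a file-like object in fixed-size chunks so memory stays bounded."""
    while True:
        chunk = src.read(chunk_size)
        if not chunk:
            break
        dst.write(chunk)


def download(url, path):
    # Runs in a worker thread, so the wxPython event loop is never
    # blocked by the long .read() calls.
    with urllib.request.urlopen(url) as response, open(path, "wb") as f:
        copy_in_chunks(response, f)


# Hypothetical usage -- start the download without blocking the GUI:
# t = threading.Thread(target=download,
#                      args=("http://example.com/big.bin", "hello.txt"),
#                      daemon=True)
# t.start()
```

Note that reading in chunks (rather than one giant .read()) also makes it possible to report progress back to the UI between chunks.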

Alternatively, you could shell out and download the file outside of Python using curl or wget.

Upvotes: 1

Related Questions