Chris Nguyen

Reputation: 160

Efficiently read and write large file

I have a simple program that reads data from a website and then writes it to a file using urllib. But the file is several hundred megabytes in size, and Python keeps raising a MemoryError:

import urllib

content = urllib.urlopen(LINK)
f = open("Something.mp4", "wb")
f.write(content.read())  # reads the entire response into memory at once
f.close()

Then, when I download and write to the file, it raises the MemoryError. I am thinking of breaking the content up into blocks and writing each block individually, but I only know how to read a file line by line rather than block by block.
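Something like the loop below is what I have in mind, though I am not sure it is the right approach (the 64 KB block size is just a guess):

import urllib

BLOCK_SIZE = 64 * 1024  # read 64 KB at a time instead of the whole body

response = urllib.urlopen(LINK)
with open("Something.mp4", "wb") as f:
    while True:
        block = response.read(BLOCK_SIZE)
        if not block:  # an empty string means the download is finished
            break
        f.write(block)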

Update: Better yet, is there a way I can read just one block of data from the server, write it, and then pick the download back up at offset X? Because I think the memory error is coming from the data returned by the urllib file object!

Question: How do I download data and write it to a file efficiently, without running out of memory? ...and/or what is a Pythonic way of achieving this?

Bonus: How can I find the actual memory limits/parameters that my system can handle? For example, since I have 4 GB of RAM, does that mean I can use 4 GB of memory? At what point will Python cut me off once I exceed a certain memory usage?
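As a rough sketch, available memory can at least be inspected with the third-party psutil package (an assumption here; it reports what the OS has free rather than any limit Python itself enforces):

import psutil  # third-party package: pip install psutil

mem = psutil.virtual_memory()
print("total RAM:     %d MB" % (mem.total // (1024 * 1024)))
print("available now: %d MB" % (mem.available // (1024 * 1024)))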

Specs: Windows 8, 64-bit, 4 GB RAM, 1.3 GHz CPU, Python 2.7

Upvotes: 0

Views: 244

Answers (1)

Suresh Jaganathan

Reputation: 537

Try using requests to download the file in blocks:

import requests

filename = "something.mp4"
r = requests.get(LINK, stream=True)  # stream=True avoids loading the whole body into memory
with open(filename, 'wb') as f:
    for block in r.iter_content(chunk_size=1024):
        if block:  # filter out keep-alive chunks
            f.write(block)

Refer to http://docs.python-requests.org/en/latest/user/advanced/#body-content-workflow
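For the "resume at offset X" part of the question, a minimal sketch along the same lines, assuming the server supports HTTP Range requests (otherwise it will simply re-send the whole file):

import os
import requests

filename = "something.mp4"
# Resume from however many bytes are already on disk.
offset = os.path.getsize(filename) if os.path.exists(filename) else 0

headers = {"Range": "bytes=%d-" % offset}
r = requests.get(LINK, headers=headers, stream=True)
with open(filename, "ab") as f:  # append to the partial file
    for block in r.iter_content(chunk_size=1024):
        if block:
            f.write(block)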

Upvotes: 1
