ink

Reputation: 317

iOS: memory usage when reading a big file

I need to import some files in my iOS app and forward the data to another layer (via MessagePack).

The imported file can be quite big, so I can't load it entirely into memory and I must read and forward it in chunks. To do so, I'm using an NSInputStream, and every time it gives me a chunk of data, I pack it and forward it. But strangely, doing so doesn't reduce the memory usage of the process, as if the chunks of data were not released immediately.
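Roughly, my reading loop looks like this (simplified; path and packAndForward: are placeholders for my actual file path and the MessagePack/forwarding step):

NSInputStream *stream = [NSInputStream inputStreamWithFileAtPath:path];
[stream open];

uint8_t buffer[64 * 1024]; // read in 64 KB chunks
while ([stream hasBytesAvailable]) {
    NSInteger bytesRead = [stream read:buffer maxLength:sizeof(buffer)];
    if (bytesRead <= 0) break;
    NSData *chunk = [NSData dataWithBytes:buffer length:bytesRead];
    [self packAndForward:chunk]; // pack with MessagePack and forward to the other layer
}
[stream close];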

The process can be simulated by this piece of code, where I create random data and extract its bytes (that's the part of the MessagePack packing that uses the memory):

for(int i = 0; i < 200; i++) {
    NSData *theData = [self generateRandomData];
    const char *buf = ((NSData*)theData).bytes;
}

The memory usage skyrockets and reaches ~450 MB (generateRandomData creates a 2 MB piece of data), then decreases back to its regular level after the end of the for loop.

I would have thought that the data behind buf would be released after every loop iteration, and that the memory usage would therefore never get very high.

Why isn't that the case? Am I missing something?

How could I import a big file like this, then? I was thinking of pausing briefly during the process (every 100 MB imported or so) to let the memory usage decrease, but that doesn't seem ideal to me.

Upvotes: 1

Views: 675

Answers (1)

mukul

Reputation: 382

I also faced the same memory-usage problem in my application (it reached ~600 MB); I fixed it by releasing the memory manually with an autorelease pool.

    for (int i = 0; i < 200; i++) {
        @autoreleasepool {
            // Create and use the data inside the pool so the autoreleased
            // NSData is drained at the end of every iteration.
            NSData *theData = [self generateRandomData];
            const char *buf = theData.bytes;
        }
    }

Create the autorelease pool around the body of the loop and then check your memory usage. Hope this helps you.
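Applied to your actual import loop, the same pattern could look like this (a sketch only; packAndForward: stands in for your MessagePack/forwarding step):

    NSInputStream *stream = [NSInputStream inputStreamWithFileAtPath:path];
    [stream open];

    uint8_t buffer[64 * 1024];
    while ([stream hasBytesAvailable]) {
        @autoreleasepool {
            NSInteger bytesRead = [stream read:buffer maxLength:sizeof(buffer)];
            if (bytesRead <= 0) break;
            // The chunk and anything autoreleased while packing it are
            // released at the end of each iteration, keeping memory flat.
            NSData *chunk = [NSData dataWithBytes:buffer length:bytesRead];
            [self packAndForward:chunk];
        }
    }
    [stream close];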

Upvotes: 2
