Reputation: 650
There is a very large picture that cannot be loaded into memory at once, because doing so may cause an out-of-memory exception. I need to shrink this picture down to a small size. What should I do?
My first thought is to open an input stream and process one buffer-sized chunk at a time. But what zoom (resizing) algorithm should I use?
Upvotes: 4
Views: 238
Reputation: 21932
If you can access the picture row-by-row (e.g. it's a bitmap), the simplest thing you could do is just downsample it, e.g. only read every nth pixel of every nth row.
// n is the integer downsampling factor
// width, height are the width and height of the original image, in pixels
// down is a new image that is (width/n) x (height/n) pixels in size
for (y = 0; y < height; y += n) {
    row = ... // read row y from the original image into a buffer
    for (x = 0; x < width; x += n) {
        down[y/n, x/n] = row[x]; // image[row, col] -- shorthand for accessing a pixel
    }
}
This is a quick-and-dirty way to resize the original image cheaply without ever loading the whole thing into memory. Unfortunately, it also introduces aliasing in the output image (down). Dealing with aliasing requires interpolation -- still possible with the row-by-row approach above, but a bit more involved.
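For instance, the interpolation can be as simple as a box filter: average each n-by-n block of source pixels instead of picking just one. A minimal Java sketch of that idea, assuming grayscale pixels stored as ints (averageStrip and the strip layout are illustrative, not from any particular API):

```java
// Illustrative sketch: downsample by factor n with box averaging,
// processing a strip of n source rows at a time so the full image
// never has to be in memory.
public class BoxDownsample {
    // Averages each n-by-n block of the strip into one output pixel.
    // rows: n consecutive source rows (filled by the row-by-row reader),
    //       each 'width' pixels wide.
    static int[] averageStrip(int[][] rows, int width, int n) {
        int[] out = new int[width / n];
        for (int x = 0; x + n <= width; x += n) {
            int sum = 0;
            for (int dy = 0; dy < n; dy++)
                for (int dx = 0; dx < n; dx++)
                    sum += rows[dy][x + dx];
            out[x / n] = sum / (n * n); // mean of the n*n block
        }
        return out;
    }
}
```

Each output row is produced from n input rows, so the working set stays at n rows regardless of the image height.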
If you can't easily access the image row-by-row -- e.g. it's a JPEG, which encodes data in 8x8 blocks -- you can still do something similar to the approach above: read a row of blocks instead of a row of pixels, and the rest of the algorithm works the same. Furthermore, if you're downsampling by a factor of 8, JPEG makes it especially easy -- you just take the DC coefficient of each block. Downsampling by factors that are multiples of 8 is also possible with this approach.
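If this is Java, you may not need to implement the subsampling loop yourself: ImageIO readers accept an ImageReadParam with setSourceSubsampling, which asks the decoder to keep only every nth pixel of every nth row. Whether the decoder truly avoids holding the full image depends on the format plugin, so treat this as a sketch to try, not a guarantee:

```java
import javax.imageio.ImageIO;
import javax.imageio.ImageReadParam;
import javax.imageio.ImageReader;
import javax.imageio.stream.ImageInputStream;
import java.awt.image.BufferedImage;
import java.io.File;
import java.util.Iterator;

public class SubsampledRead {
    // Decodes 'file' keeping only every nth pixel of every nth row.
    static BufferedImage readSubsampled(File file, int n) throws Exception {
        try (ImageInputStream in = ImageIO.createImageInputStream(file)) {
            Iterator<ImageReader> readers = ImageIO.getImageReaders(in);
            if (!readers.hasNext())
                throw new IllegalArgumentException("unrecognized image format");
            ImageReader reader = readers.next();
            try {
                reader.setInput(in);
                ImageReadParam param = reader.getDefaultReadParam();
                // keep every nth column and row, starting at offset (0, 0)
                param.setSourceSubsampling(n, n, 0, 0);
                return reader.read(0, param);
            } finally {
                reader.dispose();
            }
        }
    }
}
```

For a factor-of-8 JPEG downscale, some decoders can satisfy this from the DC coefficients alone, exactly as described above.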
I've glossed over many other details (such as color channels, pixel stride, etc), but it should be enough to get you started.
Upvotes: 3
Reputation: 563
There are a lot of different resizing algorithms, offering varying levels of quality with CPU time as the trade-off.
I believe any of these could be adapted to process a massive file in chunks relatively easily. However, you should first try existing tools to see whether they can already handle the massive file as-is.
The GD graphics library, I believe, lets you limit how much working memory it uses, so it evidently already has logic for processing files in chunks.
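To see the quality/CPU trade-off concretely: in Java, the interpolation algorithm is selectable via a rendering hint when drawing a scaled image. This sketch assumes the image already fits in memory (so it illustrates the trade-off, not the chunked processing):

```java
import java.awt.Graphics2D;
import java.awt.RenderingHints;
import java.awt.image.BufferedImage;

public class Resize {
    // Scales src to (w, h) using the given interpolation hint, e.g.
    // RenderingHints.VALUE_INTERPOLATION_NEAREST_NEIGHBOR (fast, blocky)
    // or RenderingHints.VALUE_INTERPOLATION_BICUBIC (slower, smoother).
    static BufferedImage scale(BufferedImage src, int w, int h, Object hint) {
        BufferedImage dst = new BufferedImage(w, h, BufferedImage.TYPE_INT_RGB);
        Graphics2D g = dst.createGraphics();
        g.setRenderingHint(RenderingHints.KEY_INTERPOLATION, hint);
        g.drawImage(src, 0, 0, w, h, null); // scale to fit the destination
        g.dispose();
        return dst;
    }
}
```

Swapping the hint is the only change needed to move between the cheap and the high-quality end of the spectrum.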
Upvotes: 1