Reputation: 126
When I try to convert a PDF to an image, I get an "out of memory" error for some PDFs. So I increased the heap size, and then I got the error again for a different PDF file. For the time being, assume I have no memory leak from other objects. So what would be the reason for this out-of-memory error? Is it just that the rendered image is so large (which I don't think is the case) that it consumes the heap, or does PDFBox keep a buffered image of each page in memory, contributing to the growing heap usage? Any insight would be wonderful.
Here's the link to the PDF I am trying to render: https://drive.google.com/file/d/0B_Ke2amBgdpeNFFDem5KVVVzanc/view?usp=sharing Here's the code segment:
PDFRenderer pdfRenderer = new PDFRenderer(pdDoc);
BufferedImage image = pdfRenderer.renderImageWithDPI(page - 1, 300, ImageType.GRAY);
// image = ImageHelper.convertImageToGrayscale(image);
ImageIOUtil.writeImage(image, "G:/Trial/tempImg.png", 300);
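In case the parsed document (rather than the rendered image) is part of what fills the heap, PDFBox 2.x can buffer the document in a scratch file on disk instead of in memory via `MemoryUsageSetting`. Here's a sketch of that approach, assuming PDFBox 2.x and placeholder file paths; note that `renderImageWithDPI` still allocates one full-page `BufferedImage` on the heap per call, so this only reduces the document's own footprint:

```java
import java.awt.image.BufferedImage;
import java.io.File;
import org.apache.pdfbox.io.MemoryUsageSetting;
import org.apache.pdfbox.pdmodel.PDDocument;
import org.apache.pdfbox.rendering.ImageType;
import org.apache.pdfbox.rendering.PDFRenderer;
import org.apache.pdfbox.tools.imageio.ImageIOUtil;

public class RenderWithScratchFile {
    public static void main(String[] args) throws Exception {
        // Buffer the parsed document on disk rather than on the heap.
        try (PDDocument doc = PDDocument.load(new File("G:/Trial/input.pdf"),
                                              MemoryUsageSetting.setupTempFileOnly())) {
            PDFRenderer renderer = new PDFRenderer(doc);
            for (int page = 0; page < doc.getNumberOfPages(); page++) {
                // Each call still allocates one full-page BufferedImage on the heap.
                BufferedImage image = renderer.renderImageWithDPI(page, 300, ImageType.GRAY);
                ImageIOUtil.writeImage(image, "G:/Trial/page-" + (page + 1) + ".png", 300);
                image = null; // drop the reference so the raster can be collected before the next page
            }
        }
    }
}
```

Rendering page by page and dropping each image reference before the next iteration keeps at most one page raster live at a time, which bounds the peak heap usage.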
Please note that for this particular PDF the problem was solved by increasing the heap size, but what I want to know is whether PDFBox stores buffered images in memory and contributes to the heap usage.
Here's another PDF which faced the same issue even after increasing the heap size: https://drive.google.com/file/d/0B_Ke2amBgdpedDBtaG1QcW1oYlU/view?usp=sharing With this PDF, my code takes forever while rendering page 44. I don't know why this is happening.
Upvotes: 1
Views: 3272
Reputation: 126
Well, it seems that this problem is not due to any bug or memory leak but to image size. Proposed solutions:
1) Increase your -Xmx size.
2) Switch over to a 64-bit JVM.
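For a sense of scale, here is a rough back-of-envelope estimate (my own arithmetic, assuming an A4-sized page) of the heap one rendered page needs at 300 DPI. A grayscale raster is about 1 byte per pixel and an ARGB raster about 4 bytes per pixel, so a single page can easily run to tens of megabytes before any other allocations:

```java
// Rough estimate of the raw raster size of one rendered page.
public class PageMemoryEstimate {

    // Raster size in bytes for a page of the given physical size, DPI and pixel depth.
    static long rasterBytes(double widthInches, double heightInches, int dpi, int bytesPerPixel) {
        long widthPx = Math.round(widthInches * dpi);   // A4: 8.27 in -> 2481 px at 300 DPI
        long heightPx = Math.round(heightInches * dpi); // A4: 11.69 in -> 3507 px at 300 DPI
        return widthPx * heightPx * bytesPerPixel;
    }

    public static void main(String[] args) {
        long gray = rasterBytes(8.27, 11.69, 300, 1); // ~8.7 million pixels, 1 byte each
        long argb = rasterBytes(8.27, 11.69, 300, 4); // same pixels, 4 bytes each
        System.out.println("grayscale: ~" + gray / (1024 * 1024) + " MiB"); // ~8 MiB
        System.out.println("ARGB:      ~" + argb / (1024 * 1024) + " MiB"); // ~33 MiB
    }
}
```

At 600 DPI the pixel count quadruples, so the same page would need roughly four times as much heap, which is why a higher DPI or a larger page size can push an otherwise fine run over the limit.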
EDIT: Thanks for the answers. I am just going to lay it out here: tests were performed by @Tilman Hausherr, and the result was that the heap size should be increased. Note that a 64-bit JVM was used.
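To confirm what a given run actually got, you can print the effective maximum heap and the JVM's data-model width from inside the program; a small sketch (the `sun.arch.data.model` property is HotSpot-specific, so it may be absent on other JVMs):

```java
public class HeapCheck {
    public static void main(String[] args) {
        // Maximum heap the JVM will attempt to use (reflects -Xmx).
        long maxMiB = Runtime.getRuntime().maxMemory() / (1024 * 1024);
        // "32" or "64" on HotSpot; falls back to "unknown" elsewhere.
        String bits = System.getProperty("sun.arch.data.model", "unknown");
        System.out.println("Max heap: " + maxMiB + " MiB");
        System.out.println("JVM data model: " + bits + "-bit");
    }
}
```

Running this under the same launch configuration as the renderer rules out the common case where the `-Xmx` flag was set on the wrong process or ignored.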
Upvotes: 0