DLH

Reputation: 565

Python OpenCV/Numpy memory allocation error--on only the better of two machines?

I have two machines running identical Python scripts that use OpenCV to convert image file formats. The script runs fine on my low-end notebook, which has 4 GB of memory. On my desktop, however, which has 32 GB, I get the following error:

OpenCV Error: Unspecified error (The numpy array of typenum=2, ndims=3 can not be created) in NumpyAllocator::allocate, file D:\Build\OpenCV\opencv-3.3.1\modules\python\src2\cv2.cpp, line 175
OpenCV Error: Insufficient memory (Failed to allocate 243000000 bytes) in cv::OutOfMemoryError, file D:\Build\OpenCV\opencv-3.3.1\modules\core\src\alloc.cpp, line 55

(1) The code that causes this error is as follows. No other code in the script uses OpenCV.

import cv2

# png and jpg are filenames
img = cv2.imread(png)
cv2.imwrite(jpg, img, [cv2.IMWRITE_JPEG_QUALITY, 85])

(2) Both machines are running Windows 10 on a 64-bit AMD CPU.

(3) On both machines, Python is running in 32-bit mode, according to sys.maxsize (see the version-check snippet after this list).

(4) Both machines were running Python 3.6.2. I tried updating the desktop to 3.6.3, but it made no difference.

(5) Both machines have OpenCV version 3.3.1.

(6) The desktop where I get the memory error has a slightly newer NumPy (1.13.3) than the notebook, where all is well (1.13.1).

(7) The script will convert smaller images without error, but chokes on a 9000 x 9000 pixel PNG. I realize this isn't small, but still, even this large image works just fine on the notebook.
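For reference, the bitness and version checks behind points (3) through (6) can be reproduced with something like the following (a minimal sketch; the exact print formatting is mine):

import platform
import sys

import cv2
import numpy as np

# sys.maxsize is 2**31 - 1 on a 32-bit build and 2**63 - 1 on a 64-bit build.
print("64-bit Python" if sys.maxsize > 2**32 else "32-bit Python")
print("Python version:", platform.python_version())
print("OpenCV version:", cv2.__version__)
print("NumPy version:", np.__version__)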

Since the NumPy version was the only difference I could identify, I searched for reports that 1.13.3 was known to break things, but I couldn't find anything suggesting such a problem.

Thanks in advance to anyone who can help explain the problem and how to fix it.

Upvotes: 1

Views: 1741

Answers (1)

DLH

Reputation: 565

It turns out that all of the packages I needed had 64-bit versions available, so I got things working by switching to the 64-bit Python setup.

In case it helps someone else, here's what I learned along the way:

Since I was working in a 32-bit Python environment, the boatloads of RAM on my desktop machine weren't really relevant. In 32-bit mode, I can't possibly access more than 4 GB. In reality, I can't even get that since all sorts of things compete for that 4 GB block, and if I understand correctly, no more than 2 GB will be allotted to any 32-bit Python process anyway.

With NumPy, things get even worse, because NumPy arrays require a contiguous block of memory. That makes things tight, apparently tight enough that the 243 MB I needed for the image wasn't available as a single block. It wouldn't take a memory leak to cause this: if things were already tight, the normal (and likely memory-intensive) drawing operations I did with pycairo could have left too little for the subsequent image conversion. (The Surface object had not been released because it would be used in subsequent iterations.)
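To put a number on that (my own arithmetic, assuming the usual 8-bit, 3-channel array that cv2.imread returns for a color image): a 9000 x 9000 image needs 9000 x 9000 x 3 bytes, which is exactly the 243,000,000 bytes the allocator failed to find as one contiguous block:

import numpy as np

# A 9000 x 9000 BGR image as loaded by cv2.imread: one uint8 per channel, three channels.
height, width, channels = 9000, 9000, 3
print(height * width * channels)  # 243000000 -- the figure in the error message

# On a 32-bit interpreter this allocation must fit into a single free block of
# the roughly 2 GB address space, so it can fail even with plenty of RAM installed.
try:
    img = np.zeros((height, width, channels), dtype=np.uint8)
except MemoryError:
    print("No contiguous block of address space large enough for the array")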

The surprising part--at least to me--was that the amount of contiguous memory available for an operation within this 2 GB maximum can vary wildly from machine to machine, and even from day to day, depending on all sorts of things that aren't obvious. It appears that my notebook just happens to have some fortuitous circumstances that leave 243 MB of contiguous memory available, while my desktop doesn't.

Thanks to those who offered advice.

Upvotes: 1
