Makogan

Reputation: 9642

OpenGL not throwing error and not showing difference in memory allocation for large 3D textures

I am allocating a large 3D texture and seeing differences in program output. I suspect these differences are caused by the system not having enough memory for the texture, but I am not sure how to verify this or correct the issue at runtime.

The texture is allocated as:

glTexImage3D(GL_TEXTURE_3D, 0, GL_RGBA32F, side, side, side, 0, GL_RGBA, GL_FLOAT, NULL);

When side is 512 the program works; when it is 1024 it doesn't.

However, glGetError returns GL_NO_ERROR for both values.
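A minimal sketch of the allocation and the error check (the texture setup is simplified and the name tex is just illustrative; a current GL context and a defined side are assumed):

GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_3D, tex);
glTexImage3D(GL_TEXTURE_3D, 0, GL_RGBA32F, side, side, side, 0, GL_RGBA, GL_FLOAT, NULL);

GLenum err = glGetError(); // reports GL_NO_ERROR here for both side = 512 and side = 1024
if (err != GL_NO_ERROR)
    printf("glTexImage3D error: 0x%x\n", err);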

In addition to that

glGetIntegerv(GL_GPU_MEMORY_INFO_CURRENT_AVAILABLE_VIDMEM_NVX,&x);

returns 7729 MB available out of 8192 regardless of the value of side.
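For reference, a sketch of that query next to a rough footprint estimate (GL_NVX_gpu_memory_info is NVIDIA-only and reports values in kilobytes; the variable names are only illustrative):

GLint availableKb = 0;
glGetIntegerv(GL_GPU_MEMORY_INFO_CURRENT_AVAILABLE_VIDMEM_NVX, &availableKb);

// Rough footprint of the requested texture: side^3 texels * 16 bytes per GL_RGBA32F texel,
// ignoring mipmaps, padding and driver overhead.
GLint64 neededBytes = (GLint64)side * side * side * 16;
printf("available: %d MB, requested: %lld MB\n", availableKb / 1024, (long long)(neededBytes / (1024 * 1024)));

By that estimate side = 512 needs about 2 GiB while side = 1024 needs roughly 16 GiB, more than the whole card, yet the reported available memory is the same in both cases.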

Moreover, calling glGetIntegerv(GL_MAX_TEXTURE_SIZE, &temp) tells me that the maximum texture side is supposed to be 32768 (although I know that limit is for 2D textures and probably does not apply to 3D textures).

I want the system to crash or otherwise report that I am trying to allocate too much memory, or at least give me a way to check this myself; alas, none of the methods that have been suggested to me seem to work.

Update:

I have played around some more and found behavior I am not happy with. First, I do get an out-of-memory error if I allocate textures that are large enough.

So my program has the following behavior:

- Small textures work
- Medium textures do not work and do not report errors
- Large textures do not work and report errors

I can't accept this. I need every texture either to work or to fail and warn me about it; I cannot have textures that silently fail without ever raising a flag.

Second, the threshold between working and not working fluctuates over time on the same machine, independently of program execution.

I ran the program a good number of times and found a working vs. not-working threshold, then closed it, read some documentation, launched it again, and the threshold had changed (so the threshold is stable across multiple program instances over a short period, but not stable on the same machine over a long period). This has happened multiple times already. The working point is the same across program launches that are close together in time, but not across launches that are far apart.

Checking memory using the NVIDIA extension returns the exact same value either way, so I have no reliable way of detecting when I have messed up.

TL;DR

I just want to know that I have messed up; it can be after the fact. I just want to eventually be told that my texture is too big and OpenGL failed to create it. I can't believe there is no way to check for this; there has to be some way to detect that a texture is not working other than inspecting the visual output.

Upvotes: 0

Views: 231

Answers (1)

Nicol Bolas

Reputation: 474546

The enumerator GL_MAX_3D_TEXTURE_SIZE defines how big 3D textures can be, and OpenGL 4.6 implementations are required to support at least 2048 per dimension.
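A minimal check against that limit (a sketch; side is the dimension from the question):

GLint max3d = 0;
glGetIntegerv(GL_MAX_3D_TEXTURE_SIZE, &max3d); // at least 2048 per dimension on a conforming GL 4.6 implementation
if (side > max3d)
    printf("side %d exceeds GL_MAX_3D_TEXTURE_SIZE (%d)\n", side, max3d);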

Of course, this says nothing about memory limitations.

OpenGL implementations are not required to fulfill the execution of any function immediately (unless the function specifically says it does). They can defer things like memory allocation until whenever they like. As such, the error GL_OUT_OF_MEMORY can be thrown at any time, regardless of when the operation that actually runs out of memory was executed.
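A best-effort way to surface such a deferred error (a sketch only; the specification does not promise the error will appear at any particular point) is to force the pending work to complete and then drain the error queue:

glTexImage3D(GL_TEXTURE_3D, 0, GL_RGBA32F, side, side, side, 0, GL_RGBA, GL_FLOAT, NULL);
glFinish(); // block until the driver has processed the commands issued so far

// A deferred allocation failure may show up here as GL_OUT_OF_MEMORY.
for (GLenum err = glGetError(); err != GL_NO_ERROR; err = glGetError())
    printf("deferred GL error: 0x%x\n", err);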

I just want to be told when I am trying to do something the system can't handle.

This is why Vulkan exists; OpenGL is simply not equipped to answer those questions immediately.

Upvotes: 2
