user29829771

Reputation: 1

GL_INVALID_OPERATION in Release mode due to compiler optimization, tracked down to one getter function

I have a game engine project in OpenGL. It works great in Debug mode, but when I switch to Release, some sort of compiler optimization causes OpenGL error 1282:

Debug message (1282): GL_INVALID_OPERATION error generated. State(s) are invalid 
program texture usage.
Source: API
Type: Error
Severity: high
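Since the engine already receives GL debug messages, one way to pinpoint the exact failing call (my suggestion, not something from the original post) is to make debug output synchronous and break inside the callback, so the debugger's call stack points directly at the GL call that raised the error. A minimal sketch, assuming a 4.3+ (or KHR_debug) context loaded via GLAD; it needs a live GL context to run:

```cpp
#include <glad/glad.h>  // assumption: the project loads GL via GLAD
#include <cstdio>

// With GL_DEBUG_OUTPUT_SYNCHRONOUS enabled, this runs on the thread
// that issued the bad call, so a break here shows it in the stack trace.
static void APIENTRY gl_debug_cb(GLenum source, GLenum type, GLuint id,
                                 GLenum severity, GLsizei length,
                                 const GLchar* message, const void* user) {
    std::fprintf(stderr, "GL debug (%u): %s\n", id, message);
    if (type == GL_DEBUG_TYPE_ERROR) {
#ifdef _MSC_VER
        __debugbreak();  // MSVC intrinsic; set a breakpoint here otherwise
#endif
    }
}

void enable_gl_debug_output() {
    glEnable(GL_DEBUG_OUTPUT);
    glEnable(GL_DEBUG_OUTPUT_SYNCHRONOUS);  // deliver on the erroring call
    glDebugMessageCallback(gl_debug_cb, nullptr);
}
```

Running the Release build under the debugger with this enabled turns the generic 1282 into a precise location.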

I ended up using #pragma optimize to bisect which areas were being optimized, and eventually narrowed it down to this getter:

#pragma optimize("", off)
auto gse::renderer3d::get_camera() -> camera& {
    return g_camera;
}
#pragma optimize("", on)

So, when this function is optimized, it somehow causes a texture error. This doesn't make sense to me, because all shader-binding operations using g_camera are done inside gse::renderer3d, meaning they access the g_camera variable directly, inside the gse::renderer3d namespace, without going through the getter. It also makes no sense that the getter would be optimized out, because it is called directly in multiple places throughout the solution. I have also made sure that g_camera is properly initialized. Even when I force the function to be kept with a compiler flag:

#ifdef _MSC_VER
#define FORCE_USED __declspec(dllexport)
#else
#define FORCE_USED __attribute__((used))
#endif

FORCE_USED auto gse::renderer3d::get_camera() -> camera& {
    return g_camera;
}

the issue persists. I am baffled by this, because I can't see why a simple getter would cause it. The camera class being referenced is still optimized by the compiler; only disabling optimization of the actual get_camera() function makes the error go away.

Without #pragma optimize(off) before the function: [screenshot]

With #pragma optimize(off) before the function: [screenshot]

Upvotes: -3

Views: 67

Answers (1)

Blindy

Reputation: 67447

OK, so it should be obvious that if you can't figure this out with an actual debugger attached to running code, we can't figure it out from what amounts to absolute crazy talk about disabling optimizations. I can assure you that nobody needs to do that, and there are hundreds of thousands of games out there. You are doing it wrong.

That said, I can point you to a way to narrow down the problem: RenderDoc can show you a list of OpenGL function calls, with arguments, results, and buffer and shader states at the time of each call.

[RenderDoc screenshots]

Figure out exactly what OpenGL function isn't called in Release mode that is called in Debug mode, and work backwards from it to find why your code doesn't work.
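If attaching RenderDoc isn't practical, a cruder way to work backwards (my addition, not part of the original answer) is to check glGetError after each suspect call, so the Release build stops at the first failing call instead of surfacing a generic 1282 later. A rough sketch; the macro name is mine:

```cpp
#include <glad/glad.h>  // assumption: GL loaded via GLAD
#include <cassert>
#include <cstdio>

// Wrap individual GL calls so the first failing one is reported
// with its source location, rather than showing up later as 1282.
#define GL_CHECK(call)                                            \
    do {                                                          \
        call;                                                     \
        GLenum err = glGetError();                                \
        if (err != GL_NO_ERROR) {                                 \
            std::fprintf(stderr, "%s failed at %s:%d (0x%04X)\n", \
                         #call, __FILE__, __LINE__, err);         \
            assert(false && "OpenGL call failed");                \
        }                                                         \
    } while (0)

// Usage: GL_CHECK(glBindTexture(GL_TEXTURE_2D, tex));
```

Note that glGetError stalls the pipeline, so this is for diagnosis only; compile the checks out of normal builds.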

Upvotes: -1
