doughsay

Reputation: 316

Getting "Invalid memory access (signal 11)" when using OpenGL Crystal bindings (glGenBuffers)

Here is the trivial program I'm trying to get working:

require "lib_gl"

x = [] of UInt32
# or
x = uninitialized Pointer(UInt32)
# or
x = [0_u32]

# From lib_gl:
# fun gen_buffers = "glGenBuffers"(n: Int32, buffers: UInt32*) : Void
LibGL.gen_buffers(1, x)

It crashes with this error:

Invalid memory access (signal 11) at address 0x1428
[0x10560900b] *CallStack::print_backtrace:Int32 +107
[0x1055f4a2c] __crystal_sigfault_handler +60
[0x7fff90988b3a] _sigtramp +26
[0x7fff7f455e83] glGenBuffers +19
[0x1055e4836] __crystal_main +1222
[0x1055f4928] main +40

I'm brand new to Crystal, so I'm not sure if I'm doing something wrong with the value of x being passed into the function, or if there's something deeper wrong with either the bindings or Crystal itself. I'm hoping a Crystal + OpenGL expert here might be able to help!

Upvotes: 1

Views: 381

Answers (1)

doughsay

Reputation: 316

I answered my own question before posting, but I still think it's worth sharing:

This happens when there is no current OpenGL context. A revised (working) example is below, using the GLFW and LibGLFW libraries to open a window and make its context current:

require "glfw"
require "lib_glfw"
require "lib_gl"

LibGLFW.init
window = GLFW::Window.new(800, 600, "Foo")
window.set_context_current

x = [0_u32]

LibGL.gen_buffers(1, x)

puts x

Please note, though: the array MUST be initialized with the expected number of "return" values. Using x = [] of UInt32 crashes with the same error, because an empty array owns no memory for glGenBuffers to write into. So make sure you've created an array of n zeros, where n is the number of buffers you're asking to be generated.
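To generalize that point, you can size the array from a variable so it always matches the first argument. This is only a sketch, assuming the same lib_gl bindings as above and an OpenGL context that has already been made current (e.g. via GLFW, as in the working example):

```crystal
require "lib_gl"

# Assumes an OpenGL context is already created and current;
# without one, glGenBuffers segfaults exactly as described above.

n = 3 # how many buffer names we want

# Allocate exactly n zero-initialized UInt32 slots so glGenBuffers
# has valid memory to write each generated name into.
buffers = Array(UInt32).new(n, 0_u32)

LibGL.gen_buffers(n, buffers)

puts buffers # each element should now hold a generated buffer name
```

A Slice(UInt32).new(n) would work just as well here: Crystal passes the underlying pointer for both Array and Slice when a C binding expects a UInt32* parameter.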

Upvotes: 2
