I have a very simple LWJGL program:
package com.github.fabioticconi;
import org.lwjgl.glfw.GLFWErrorCallback;
import org.lwjgl.opengl.GL;
import org.lwjgl.system.MemoryUtil;
import java.nio.FloatBuffer;
import static org.lwjgl.glfw.GLFW.*;
import static org.lwjgl.opengl.GL11.*;
import static org.lwjgl.opengl.GL15.*;
import static org.lwjgl.opengl.GL20.*;
import static org.lwjgl.opengl.GL30.glBindVertexArray;
import static org.lwjgl.opengl.GL30.glGenVertexArrays;
import static org.lwjgl.system.MemoryUtil.NULL;
/**
* Author: Fabio Ticconi
* Date: 10/03/18
*/
public class Main
{
    public static void main(final String[] args)
    {
        // Setup an error callback. The default implementation
        // will print the error message in System.err.
        GLFWErrorCallback.createPrint(System.err).set();

        // Initialise GLFW
        if (!glfwInit())
            throw new IllegalStateException("Unable to initialize GLFW");

        glfwWindowHint(GLFW_RESIZABLE, GL_FALSE);

        // IF I ENABLE THE BELOW, I DON'T SEE THE TRIANGLE!
        // glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
        // glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 2);
        // glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
        // glfwWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GL_TRUE);

        // Open a window and create its OpenGL context
        final long window = glfwCreateWindow(1024, 768, "Test", NULL, NULL);
        if (window == NULL)
            throw new RuntimeException("Failed to create the GLFW window");

        glfwMakeContextCurrent(window);

        // This line is critical for LWJGL's interoperation with GLFW's
        // OpenGL context, or any context that is managed externally.
        // LWJGL detects the context that is current in the current thread,
        // creates the GLCapabilities instance and makes the OpenGL
        // bindings available for use.
        GL.createCapabilities();

        // Ensure we can capture the escape key being pressed below
        glfwSetInputMode(window, GLFW_STICKY_KEYS, GL_TRUE);

        final float[] vertices = new float[] { 0.0f, 0.5f, 0.0f, -0.5f, -0.5f, 0.0f, 0.5f, -0.5f, 0.0f };

        int vaoId;
        final int vboId;
        FloatBuffer verticesBuffer = null;
        try
        {
            verticesBuffer = MemoryUtil.memAllocFloat(vertices.length);
            verticesBuffer.put(vertices).flip();

            // Create the VAO and bind to it
            vaoId = glGenVertexArrays();
            glBindVertexArray(vaoId);

            // Create the VBO and bind to it
            vboId = glGenBuffers();
            glBindBuffer(GL_ARRAY_BUFFER, vboId);
            glBufferData(GL_ARRAY_BUFFER, verticesBuffer, GL_STATIC_DRAW);

            // Define structure of the data
            glVertexAttribPointer(0, 3, GL_FLOAT, false, 0, 0);

            // Unbind the VBO
            glBindBuffer(GL_ARRAY_BUFFER, 0);

            // Unbind the VAO
            glBindVertexArray(0);
        }
        finally
        {
            if (verticesBuffer != null)
            {
                MemoryUtil.memFree(verticesBuffer);
            }
        }

        do
        {
            // Swap buffers
            glfwSwapBuffers(window);
            glfwPollEvents();

            glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

            // Bind to the VAO
            glBindVertexArray(vaoId);
            glEnableVertexAttribArray(0);

            // Draw the vertices
            glDrawArrays(GL_TRIANGLES, 0, 3);

            // Restore state
            glDisableVertexAttribArray(0);
            glBindVertexArray(0);
        } // Check if the ESC key was pressed or the window was closed
        while (glfwGetKey(window, GLFW_KEY_ESCAPE) != GLFW_PRESS && !glfwWindowShouldClose(window));
    }
}
When I run it, I see a white triangle, which is what I expected.
If, however, I uncomment these lines:
// glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
// glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 2);
// glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
// glfwWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GL_TRUE);
I don't see the triangle, just a black window. Experimenting a bit, it looks like whenever I request a context version of 3 or higher, the triangle is not displayed unless I use GLFW_OPENGL_COMPAT_PROFILE.
I'm on Linux, with NVIDIA proprietary drivers apparently supporting up to OpenGL 4.5:
$ glxinfo | grep -i open
OpenGL vendor string: NVIDIA Corporation
OpenGL renderer string: GeForce GTX 970M/PCIe/SSE2
OpenGL core profile version string: 4.5.0 NVIDIA 384.111
OpenGL core profile shading language version string: 4.50 NVIDIA
OpenGL core profile context flags: (none)
OpenGL core profile profile mask: core profile
OpenGL core profile extensions:
OpenGL version string: 4.5.0 NVIDIA 384.111
OpenGL shading language version string: 4.50 NVIDIA
OpenGL context flags: (none)
OpenGL profile mask: (none)
OpenGL extensions:
OpenGL ES profile version string: OpenGL ES 3.2 NVIDIA 384.111
OpenGL ES profile shading language version string: OpenGL ES GLSL ES 3.20
OpenGL ES profile extensions:
so I don't really get why LWJGL seems to fail silently when I request an OpenGL 3 or 4 core context.
Those lines are apparently needed to support macOS (according to all the OpenGL and LWJGL tutorials I've come across), and in general I'd like to know what's going on.
Any clues?
In an OpenGL core profile it is mandatory to supply your own shaders. There is no fixed-function fallback that is used if you don't provide one.
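For example, a minimal shader pair plus program setup could look like the sketch below. The GLSL sources, the position/fragColor names and the exception-based error handling are illustrative, not taken from your code; it only uses GL11/GL20 functions your static imports already cover. It should run once after GL.createCapabilities(), so that glUseProgram is in effect before glDrawArrays:

// GLSL 1.50 matches an OpenGL 3.2 core context
final String vertexSrc =
        "#version 150 core\n" +
        "in vec3 position;\n" +
        "void main() { gl_Position = vec4(position, 1.0); }\n";
final String fragmentSrc =
        "#version 150 core\n" +
        "out vec4 fragColor;\n" +
        "void main() { fragColor = vec4(1.0); }\n"; // plain white

// Compile the vertex shader, failing loudly instead of silently
final int vs = glCreateShader(GL_VERTEX_SHADER);
glShaderSource(vs, vertexSrc);
glCompileShader(vs);
if (glGetShaderi(vs, GL_COMPILE_STATUS) == GL_FALSE)
    throw new RuntimeException(glGetShaderInfoLog(vs));

// Same for the fragment shader
final int fs = glCreateShader(GL_FRAGMENT_SHADER);
glShaderSource(fs, fragmentSrc);
glCompileShader(fs);
if (glGetShaderi(fs, GL_COMPILE_STATUS) == GL_FALSE)
    throw new RuntimeException(glGetShaderInfoLog(fs));

// Link both into a program; bind "position" to attribute 0 so it
// matches the existing glVertexAttribPointer(0, ...) call
final int program = glCreateProgram();
glAttachShader(program, vs);
glAttachShader(program, fs);
glBindAttribLocation(program, 0, "position");
glLinkProgram(program);
if (glGetProgrami(program, GL_LINK_STATUS) == GL_FALSE)
    throw new RuntimeException(glGetProgramInfoLog(program));
glUseProgram(program);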
I'm also not convinced that this really fails silently, since you never query glGetError(). Most probably you get an error from the glDrawArrays line.
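As a quick diagnostic, you could poll for errors right after the draw call (glGetError and the constants come from the GL11 import your code already has):

// Returns GL_NO_ERROR (0) if no error has been recorded since the last check
final int err = glGetError();
if (err != GL_NO_ERROR)
    System.err.println("GL error after glDrawArrays: 0x" + Integer.toHexString(err));

In a core profile, issuing a draw call with no program object bound generates GL_INVALID_OPERATION (0x502), which is exactly what this check would surface.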