Reputation: 736
I am trying to do server-side rendering for a problem that I am working on. EGL provides a way to create an OpenGL context without the need for a windowing system. I have been able to successfully render offscreen using EGL on my laptop, but when I try to run the code on a DigitalOcean instance, EGL fails to initialize. Being able to run this code on a compute resource from a cloud provider is one of the use cases I need to support.
I want to know whether EGL is a viable approach, but I don't understand why it is failing. Does it require a GPU? Is this a problem with running on a virtual machine?
The following code reproduces the problem I am experiencing:
#include <EGL/egl.h>
#include <assert.h>

int main(int argc, char** argv) {
    // Get the default display connection.
    EGLDisplay display = eglGetDisplay(EGL_DEFAULT_DISPLAY);
    assert(display != EGL_NO_DISPLAY);

    // Initialize EGL on that display; this is the call that fails.
    EGLBoolean result = eglInitialize(display, NULL, NULL);
    //assert(result != EGL_FALSE);

    EGLint errcode = eglGetError();
    assert(errcode == EGL_SUCCESS);
    return 0;
}
The error code returned after calling eglInitialize is EGL_NOT_INITIALIZED, which, according to the header, means "EGL is not initialized, or could not be initialized, for the specified EGL display connection." The default display is returned without error, so I assume the problem is that it could not be initialized, and I am trying to work out why it was not initialized.
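For reference, one way I can think of to narrow this down is to ask the installed EGL library what it supports before creating a display. This is only a small diagnostic sketch; it assumes the implementation exposes EGL_EXT_client_extensions (if it does not, eglQueryString simply returns NULL here):

#include <EGL/egl.h>
#include <stdio.h>

int main(void) {
    // With EGL_EXT_client_extensions this returns the client extension
    // string before any display exists; otherwise it returns NULL.
    const char* exts = eglQueryString(EGL_NO_DISPLAY, EGL_EXTENSIONS);
    if (exts == NULL) {
        printf("Client extensions not queryable (eglGetError: 0x%x)\n", eglGetError());
    } else {
        printf("EGL client extensions: %s\n", exts);
    }
    return 0;
}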
Upvotes: 3
Views: 1666
Reputation: 6118
If you want to use EGL with hardware acceleration, you need a GPU, so a server without a GPU provides little benefit.
If you still want to render on the server in software while using the OpenGL API, you can look into Mesa's software implementation (llvmpipe).
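For example, Mesa's EGL can be brought up without a GPU and without any display server through its "surfaceless" platform. The following is only a minimal sketch, assuming Mesa's EGL and its llvmpipe software rasterizer are installed and that EGL_MESA_platform_surfaceless appears in the client extension string (availability and package names vary by distribution):

#include <EGL/egl.h>
#include <EGL/eglext.h>
#include <assert.h>

int main(void) {
    // Look up the extension entry point for platform-specific displays.
    PFNEGLGETPLATFORMDISPLAYEXTPROC getPlatformDisplay =
        (PFNEGLGETPLATFORMDISPLAYEXTPROC) eglGetProcAddress("eglGetPlatformDisplayEXT");
    assert(getPlatformDisplay != NULL);

    // Ask Mesa for its surfaceless platform; no X11/Wayland/GBM needed.
    EGLDisplay display = getPlatformDisplay(EGL_PLATFORM_SURFACELESS_MESA,
                                            EGL_DEFAULT_DISPLAY, NULL);
    assert(display != EGL_NO_DISPLAY);

    EGLint major, minor;
    EGLBoolean ok = eglInitialize(display, &major, &minor);
    assert(ok == EGL_TRUE);

    // From here: choose an EGLConfig, create a context (pbuffer surface or
    // EGL_KHR_surfaceless_context), render, and read pixels back.

    eglTerminate(display);
    return 0;
}

Link against -lEGL; everything after eglInitialize is the same as in any other offscreen EGL setup.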
But if you are rendering in software anyway, you can also consider other approaches entirely, such as a software ray tracer like POV-Ray.
Upvotes: 3