German

Reputation: 335

Why doesn't OpenGL draw a polygon in this code?

Here is the simplest possible code, yet it doesn't draw anything. That shouldn't be possible.
Everything seems absolutely correct, but I only see a black background.
It used to work all the time, but now it doesn't.
The color is set correctly and the blue polygon should be visible. But nothing appears.

Code:

#include <iostream>
#include <chrono>
#include <windows.h> // required for GetSystemMetrics in main()
#include <GL/glut.h>

using namespace std;

constexpr auto FPS_RATE = 60;
int windowHeight = 600, windowWidth = 600;

void init();
void displayFunction();
void idleFunction();
double getTime();

double getTime()
{
    using Duration = std::chrono::duration<double>;
    return std::chrono::duration_cast<Duration>(
        std::chrono::high_resolution_clock::now().time_since_epoch()
        ).count();
}

const double frame_delay = 1.0 / FPS_RATE;
double last_render = 0;

void init()
{
    glutDisplayFunc(displayFunction);
    glutIdleFunc(idleFunction);
    glClearColor(0.0, 0.0, 0.0, 0.0);
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    gluOrtho2D(-windowWidth / 2, windowWidth / 2, -windowHeight / 2, windowHeight / 2);
}

void idleFunction()
{
    const double current_time = getTime();
    if ((current_time - last_render) > frame_delay)
    {
        last_render = current_time;
        glutPostRedisplay();
    }
}

void displayFunction()
{
    glClear(GL_COLOR_BUFFER_BIT);

    glBegin(GL_POLYGON);
    glColor3i(0, 0, 1);

    glVertex2i(-50, 0);
    glVertex2i(50, 0);
    glVertex2i(0, 50);
    glVertex2i(100, 50);

    glEnd();
    glutSwapBuffers();
}

int main(int argc, char* argv[])
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
    glutInitWindowSize(windowWidth, windowHeight);
    glutInitWindowPosition((GetSystemMetrics(SM_CXSCREEN) - windowWidth) / 2, (GetSystemMetrics(SM_CYSCREEN) - windowHeight) / 2);
    glutCreateWindow("Window");
    init();
    glutMainLoop();
    return 0;
}

Upvotes: 1

Views: 177

Answers (1)

Rabbid76

Reputation: 210878

The issue is glColor3i.

If you use

glColor3f(0, 0, 1.0f);

then you will see a fully blue polygon. But if you want to use glColor3i, then the color has to be set to

glColor3i(0, 0, 2147483647); // 2147483647 == 0x7fffffff

to get a polygon with the same blue color.

When you use a version of glColor with signed integral arguments, such as glColor3b, glColor3s or glColor3i, the full range of the integral type is mapped to the floating-point range [-1.0, 1.0]. So for glColor3i the integral values in the range [-2147483648, 2147483647] are mapped to [-1.0, 1.0] (see Common integral data types).
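As a quick illustration of the signed mapping, here is a minimal, self-contained sketch (not part of the question's code). It uses the legacy OpenGL conversion formula (2c + 1) / (2^b - 1) for a b-bit signed component; the function name normalizeSigned32 is chosen just for this example:

#include <cstdint>
#include <cstdio>

// Maps a signed 32-bit color component to the floating-point range [-1.0, 1.0],
// following the legacy OpenGL convention (2c + 1) / (2^32 - 1).
double normalizeSigned32(int32_t c)
{
    return (2.0 * c + 1.0) / 4294967295.0; // 4294967295 == 2^32 - 1
}

int main()
{
    std::printf("%f\n", normalizeSigned32(2147483647)); // ~1.0: full intensity
    std::printf("%f\n", normalizeSigned32(1));          // ~0.0: why glColor3i(0, 0, 1) looks black
}

This also explains the original symptom: glColor3i(0, 0, 1) produces a blue component of roughly 0.0, so the polygon is drawn in (almost exactly) the same black as the background.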

The unsigned versions of glColor, such as glColor3ub, glColor3us or glColor3ui, map the integral values to the range [0.0, 1.0]. E.g. glColor3ub maps the arguments from [0, 255] to [0.0, 1.0].
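Applied to the question's displayFunction, any of the following calls should therefore produce the same blue polygon; only the glColor line needs to change:

glColor3f(0.0f, 0.0f, 1.0f); // float version, components in [0.0, 1.0]
glColor3ub(0, 0, 255);       // unsigned byte version, components in [0, 255]
glColor3i(0, 0, 2147483647); // signed int version, full range maps to [-1.0, 1.0]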

Upvotes: 3
