J.Doe

Reputation: 1552

OpenGL ES orthographic projection matrix not working

So my goal is simple. I am trying to get my coordinate space set up so that the origin is at the bottom left of the screen and the top-right corner is at (screen.width, screen.height).

Also, this is a COMPLETELY 2D engine, so no 3D stuff is needed. I just need those coordinates to work.

Right now I am trying to plot a couple of points on the screen, mostly at places like (0, 0), (width, height), (width / 2, height / 2), etc., so I can see if things are working right.

Unfortunately, my efforts to get this going have so far been in vain: instead of multiple points I get a single one in the dead center of the device (obviously they are all overlapping).

So here is my code. What exactly am I doing wrong?

Vertex Shader

uniform vec4 color;
uniform float pointSize;
uniform mat4 orthoMatrix;

attribute vec3 position;

varying vec4 outColor;
varying vec3 center;

void main() {
    center = position;
    outColor = color;
    gl_PointSize = pointSize;
    gl_Position = vec4(position, 1) * orthoMatrix;
}

And here is how I make the matrix. I am using GLKit, so it should theoretically be making the orthographic matrix for me. However, if you have a custom function that you think would do this better, that is fine! I can use that too.

var width:Int32 = 0
var height:Int32 = 0
var matrix:[GLfloat] = []
func onload()
{
   width = Int32(self.view.bounds.size.width)
   height = Int32(self.view.bounds.size.height)
   glViewport(0, 0, GLsizei(height), GLsizei(width))
   matrix = glkitmatrixtoarray(GLKMatrix4MakeOrtho(0, GLfloat(width), 0, GLfloat(height), -1, 1))
}
func glkitmatrixtoarray(mat: GLKMatrix4) -> [GLfloat]
{
    var buildme:[GLfloat] = []
    buildme.append(mat.m.0)
    buildme.append(mat.m.1)
    buildme.append(mat.m.3)
    buildme.append(mat.m.4)
    buildme.append(mat.m.5)
    buildme.append(mat.m.6)
    buildme.append(mat.m.7)
    buildme.append(mat.m.8)
    buildme.append(mat.m.9)
    buildme.append(mat.m.10)
    buildme.append(mat.m.11)
    buildme.append(mat.m.12)
    buildme.append(mat.m.13)
    buildme.append(mat.m.15)
    return buildme
}

Passing it over to the shader

func draw()
{
        //Setting up shader for use
        let loc3 = glGetUniformLocation(program, "orthoMatrix")
        if (loc3 != -1)
        {
            glUniformMatrix4fv(loc3, 1, GLboolean(GL_TRUE), &matrix[0])
        }
        //Passing points and extra data

}

Note: If you remove the multiplication with the matrix in the vertex shader, the points show up; however, most of them are obviously off screen because of OpenGL's default coordinate system.

Also: I have tried using this function rather than GLKit's method, with the same results. Perhaps I am not passing the right things into the matrix-making function, or maybe I am not getting it to the shader properly.

EDIT: I have put the project file up in case you want to see how everything fits together.

Upvotes: 0

Views: 532

Answers (1)

J.Doe

Reputation: 1552

OK, I finally figured this out! What I did:

1. I miscounted when turning the GLKit matrix into an array.
2. When passing the matrix as a uniform, you actually want the address of the whole array, not just the beginning element.
3. GL_TRUE is not a proper argument when passing the matrix to the shader (in OpenGL ES the transpose parameter must be GL_FALSE).

Thank you reto matic
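
For reference, here is a rough sketch of what those fixes could look like. The helper name uploadOrthoMatrix is made up for illustration, and program and matrix are assumed to be the same kind of variables used in the question; this follows the three points above rather than being the poster's actual fixed project.

import GLKit
import OpenGLES

// Fix 1: copy all 16 elements of the GLKMatrix4, skipping none.
func glkitmatrixtoarray(mat: GLKMatrix4) -> [GLfloat] {
    return [mat.m.0,  mat.m.1,  mat.m.2,  mat.m.3,
            mat.m.4,  mat.m.5,  mat.m.6,  mat.m.7,
            mat.m.8,  mat.m.9,  mat.m.10, mat.m.11,
            mat.m.12, mat.m.13, mat.m.14, mat.m.15]
}

// Hypothetical helper: upload the matrix to the shader's "orthoMatrix" uniform.
// Fix 2: give GL a pointer to the whole 16-float array (passing the Swift array
// itself, the same idea as &matrix rather than &matrix[0]).
// Fix 3: the transpose argument must be GL_FALSE in OpenGL ES.
func uploadOrthoMatrix(program: GLuint, matrix: [GLfloat]) {
    let loc3 = glGetUniformLocation(program, "orthoMatrix")
    if loc3 != -1 {
        glUniformMatrix4fv(loc3, 1, GLboolean(GL_FALSE), matrix)
    }
}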

Upvotes: 1
