Reputation: 153
I am trying to render a texture generated by the camera onto the iPhone screen. I downloaded the color tracking example from Brad Larson at http://www.sunsetlakesoftware.com/2010/10/22/gpu-accelerated-video-processing-mac-and-ios (direct link to the sample code: http://www.sunsetlakesoftware.com/sites/default/files/ColorTracking.zip).
In the ColorTrackingViewController drawFrame method he uses the following arrays to define the vertex coordinates and the corresponding texture coordinates for rendering a textured square:
static const GLfloat squareVertices[] = {
    -1.0f, -1.0f,
     1.0f, -1.0f,
    -1.0f,  1.0f,
     1.0f,  1.0f,
};
static const GLfloat textureVertices[] = {
    1.0f, 1.0f,
    1.0f, 0.0f,
    0.0f, 1.0f,
    0.0f, 0.0f,
};
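For reference, later in the same method the two arrays are handed to the shader program and drawn as a triangle strip, roughly like this (paraphrased from memory, so the attribute index names may not match the sample exactly):

glVertexAttribPointer(ATTRIB_VERTEX, 2, GL_FLOAT, 0, 0, squareVertices);
glEnableVertexAttribArray(ATTRIB_VERTEX);
glVertexAttribPointer(ATTRIB_TEXTUREPOSITION, 2, GL_FLOAT, 0, 0, textureVertices);
glEnableVertexAttribArray(ATTRIB_TEXTUREPOSITION);

// Four vertices as GL_TRIANGLE_STRIP: the i-th texture coordinate pair
// is attached to the i-th position pair.
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);

So each texture coordinate pair above is tied to the vertex pair at the same index.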
I don't understand why these texture coordinates work correctly.
In my opinion, and in other example code I have seen that also works correctly, they should be:
static const GLfloat textureVertices[] = {
    0.0f, 1.0f,
    1.0f, 1.0f,
    0.0f, 0.0f,
    1.0f, 0.0f,
};
I went through the whole code, but I cannot figure out why the texture coordinates from the sample work correctly. What am I missing?
Upvotes: 0
Views: 279
Reputation: 26
I believe it is because the image data from the iPhone camera is always delivered rotated 90° CCW. To counteract that rotation he's setting the texture coordinates to be rotated 90° CCW too. Sometimes two wrongs do make a right?
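You can check the geometry yourself: take the coordinates you expected and give each (s, t) pair a quarter turn about the centre of the unit square, and you get exactly the array from the sample. A quick sketch in plain C (the helper is mine, not from the project):

#include <stdio.h>

/* Quarter turn of a texture coordinate about the centre of the unit
   square: (s, t) -> (t, 1 - s). */
static void quarterTurn(float s, float t, float *outS, float *outT) {
    *outS = t;
    *outT = 1.0f - s;
}

int main(void) {
    /* The coordinates the question expected, one (s, t) pair per vertex
       of the triangle strip. */
    const float expected[4][2] = {
        { 0.0f, 1.0f },
        { 1.0f, 1.0f },
        { 0.0f, 0.0f },
        { 1.0f, 0.0f },
    };

    for (int i = 0; i < 4; i++) {
        float s, t;
        quarterTurn(expected[i][0], expected[i][1], &s, &t);
        /* Prints 1.0, 1.0 / 1.0, 0.0 / 0.0, 1.0 / 0.0, 0.0 -- the same
           values as textureVertices in drawFrame. */
        printf("%.1ff, %.1ff,\n", s, t);
    }
    return 0;
}

Whether you call that quarter turn clockwise or counter-clockwise depends on which way you read the t axis; the point is that the sample's coordinates are just your "expected" ones with the camera's rotation undone.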
Upvotes: 1