Curyous

Reputation: 8866

How do you use GL_UNSIGNED_BYTE for texture coordinates?

In the "OpenGL ES Programming Guide for iOS" documentation included with XCode, in the "Best Practices for Working with Vertex Data" section, under the heading "Use the Smallest Acceptable Types for Attributes", it says

Specify texture coordinates using 2 or 4 unsigned bytes (GL_UNSIGNED_BYTE) or unsigned short (GL_UNSIGNED_SHORT).

I'm a bit puzzled. I thought that texture coordinates would be < 1 most of the time, and so would require a float to represent fractional values. How do you use unsigned bytes or unsigned shorts? Do you divide them by 255 in the shader?

Upvotes: 1

Views: 1323

Answers (1)

Jesse Rusak

Reputation: 57168

If you use GL_UNSIGNED_BYTE you'll end up passing a normalized parameter of GL_TRUE to (for example) glVertexAttribPointer, indicating that the values you're passing are not already between 0 and 1, but should be normalized from their full range (e.g. 0 to 255) to the range 0 to 1 before being passed to your shader. (See Section 2.1.2 of the OpenGL ES 2.0 spec for more details.)

In other words, when you're passing integer types like unsigned byte, set the "normalized" parameter to GL_TRUE and use the full range of that type (such as 0 to 255), so a value of 127 is approximately equivalent to a floating-point 0.5.
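For example, here's a minimal sketch of what that call might look like. The Vertex layout and the texCoordAttrib attribute location are illustrative assumptions (not from the documentation), and it assumes a VBO is bound so the pointer argument is a byte offset:

    #include <stddef.h>              /* offsetof */
    #include <OpenGLES/ES2/gl.h>     /* iOS OpenGL ES 2.0 header */

    /* Interleaved vertex: float positions, unsigned-byte texture coordinates.
       The struct layout and attribute index are assumptions for illustration. */
    typedef struct {
        GLfloat position[3];
        GLubyte texCoord[2];   /* 0..255; the GL normalizes these to 0..1 */
    } Vertex;

    static void setUpTexCoordAttrib(GLuint texCoordAttrib)
    {
        glVertexAttribPointer(texCoordAttrib,      /* location from glGetAttribLocation */
                              2,                   /* two components: s and t */
                              GL_UNSIGNED_BYTE,    /* smallest acceptable type */
                              GL_TRUE,             /* normalized: 255 -> 1.0, 127 -> ~0.5 */
                              sizeof(Vertex),      /* stride of the interleaved array */
                              (const GLvoid *)offsetof(Vertex, texCoord));
        glEnableVertexAttribArray(texCoordAttrib);
    }

With normalized set to GL_TRUE, the shader still sees ordinary floating-point texture coordinates in [0, 1]; no division by 255 is needed in the shader itself.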

Upvotes: 3
