Reputation: 22591
What are the valid possible values to pass to `internalformat` in WebGL's `texImage2D` call? The WebGL spec doesn't seem to make this clear, and I'm not sure which enums from the desktop GL documentation will work in WebGL.
WebGL defines `RGB`, `RGBA`, `RGBA4`, `RGB5_A1`, and `RGB565`. Are all of these guaranteed to be valid internal formats?
Does the `type` parameter get taken into account? If I set `internalformat` to `RGB` and `type` to `UNSIGNED_SHORT_5_6_5`, does this guarantee the internal format will be 16-bit? Or does it always use `RGB8`, or decide based on other factors?
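For concreteness, here is the kind of call I mean (the sizes and the pixel buffer are just placeholders):

```js
const gl = document.createElement('canvas').getContext('webgl');
gl.bindTexture(gl.TEXTURE_2D, gl.createTexture());

// internalformat RGB with type UNSIGNED_SHORT_5_6_5: is the texture
// guaranteed to be stored as 16 bits per pixel?
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGB, 4, 4, 0,
              gl.RGB, gl.UNSIGNED_SHORT_5_6_5,
              new Uint16Array(4 * 4));
```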
Upvotes: 3
Views: 2611
Reputation: 49572
From my understanding, the following parameters are valid in WebGL (see the sketch after the list):
internalformat: `GL_ALPHA`, `GL_LUMINANCE`, `GL_LUMINANCE_ALPHA`, `GL_RGB`, `GL_RGBA`.
format: `GL_ALPHA`, `GL_RGB`, `GL_RGBA`, `GL_LUMINANCE`, and `GL_LUMINANCE_ALPHA`.
type: `GL_UNSIGNED_BYTE`, `GL_UNSIGNED_SHORT_5_6_5`, `GL_UNSIGNED_SHORT_4_4_4_4`, and `GL_UNSIGNED_SHORT_5_5_5_1`.
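As a minimal sketch (the context setup and pixel sizes here are placeholders, not from the question), two calls that use only combinations from the lists above:

```js
const gl = document.createElement('canvas').getContext('webgl');
gl.bindTexture(gl.TEXTURE_2D, gl.createTexture());

// LUMINANCE + UNSIGNED_BYTE: one byte per pixel.
gl.texImage2D(gl.TEXTURE_2D, 0, gl.LUMINANCE, 2, 2, 0,
              gl.LUMINANCE, gl.UNSIGNED_BYTE,
              new Uint8Array(2 * 2));

// RGBA + UNSIGNED_SHORT_4_4_4_4: one 16-bit value per pixel.
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, 2, 2, 0,
              gl.RGBA, gl.UNSIGNED_SHORT_4_4_4_4,
              new Uint16Array(2 * 2));
```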
`internalformat` must match `format`
From the OpenGL ES 2.0 documentation:
internalformat must match format. No conversion between formats is supported during texture image processing. type may be used as a hint to specify how much precision is desired, but a GL implementation may choose to store the texture array at any internal resolution it chooses.
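A short sketch of what the "must match" rule means in practice, assuming the same context setup as above; the error check reflects the ES 2.0 rule, which specifies `INVALID_OPERATION` for mismatched formats rather than a conversion:

```js
const gl = document.createElement('canvas').getContext('webgl');
gl.bindTexture(gl.TEXTURE_2D, gl.createTexture());

// internalformat (RGBA) does not match format (RGB): no conversion is
// performed; the call fails with INVALID_OPERATION instead.
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, 1, 1, 0,
              gl.RGB, gl.UNSIGNED_BYTE, new Uint8Array(4));
console.log(gl.getError() === gl.INVALID_OPERATION); // true
```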
If the WebGL documentation does not give enough details, maybe the full OpenGL ES 2.0 documentation can help (chapter 3.7.1). From looking at the differences between WebGL and OpenGL ES 2.0, there shouldn't be any difference in `glTexImage2D` between the two.
Upvotes: 3