Marwane K.A.

Reputation: 143

Writing fragment shaders: cannot make sense of how the uniforms are defined

I'm trying to make custom filters with Phaser, but I don't get how the uniforms, and vTextureCoord in particular, are specified. Here's a JSFiddle (EDIT: Ignore the image; the minimal case lies in the square gradient):

I pulled my hair out over this during the last Ludum Dare, trying to figure out the pixel position within the sprite (i.e. [0, 0] at the bottom-left corner and [sprite.w, sprite.h] at the top-right one)... but I couldn't find any reliable way to compute it regardless of the sprite's position and size.

Thanks for your help!


EDIT: As emackey pointed out, it seems that either Phaser or Pixi (I'm not sure at which level it's handled) uses an intermediate texture. Because of this, the uSampler I get is not the original texture but a modified one — for example, shifted or cropped when the sprite extends past the top-left corner of the screen. The uSampler and vTextureCoord do work well together, so as long as I'm doing simple things like color tweaks, everything seems fine; but for manipulating texture coordinates it's simply not reliable.

Can a Phaser/Pixi guru explain why it works that way, and what I'm supposed to do to get clean coordinates and work with my actual source texture? I managed to hack a shader by "fixing vTextureCoord" and plugging my texture into iChannel0, but this feels a bit hacky.

Thanks.

Upvotes: 3

Views: 1312

Answers (1)

emackey

Reputation: 12448

I'm not too familiar with Phaser, but we can shed a little light on what that fragment shader is really doing. Load your JSFiddle and replace the GLSL main body with this:

void main() {
    // Visualize the UVs: red = x, green = y (doubled, since they top out at 0.5)
    gl_FragColor = vec4(vTextureCoord.x * 2., vTextureCoord.y * 2., 1., 1.);
    // Blend in the original texture, lifted toward gray so both stay visible
    gl_FragColor *= texture2D(uSampler, vTextureCoord) * 0.6 + 0.4;
}

The above filter shader is a combination of the original texture (with some gray added) and your colors, so you can see both the texture and the UVs at the same time.

You're correct that vTextureCoord only goes to 0.5, hence the * 2. above, but that's not the whole story: Try dragging your sprite off the top-left. The texture slides but the texture coordinates don't move!

How is that even possible? My guess is that the original sprite texture is being rendered to an intermediate texture, using some of the sprite's location info for the transform. By the time your custom filter runs, your filter GLSL code is running on what's now the transformed intermediate texture, and the texture coordinates no longer have a known relation to the original sprite texture.

If you run the Chrome Canvas Inspector you can see that indeed there are multiple passes, including a render-to-texture pass. You can also see that the filter pass is using coordinates that appear to be the ratio of the filter area size to the game area size, which in this case is 0.5 on both dimensions.
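To make that ratio concrete, here's a small sketch of the coordinate relationship (a hypothetical helper, not a Phaser API — and note it does not account for the extra shift you see when dragging the sprite off-screen, only the scaling):

```javascript
// Hypothetical helper illustrating the ratio observed above: if the
// intermediate texture's UVs only reach filterArea / gameArea, then
// recovering a sprite-local pixel position from vTextureCoord means
// dividing out that ratio before scaling up to sprite pixels.
function uvToSpritePixel(uv, filterArea, gameArea, spriteSize) {
  // Undo the filter-area / game-area scaling (0.5 in the question's case)
  const normX = uv.x / (filterArea.width / gameArea.width);
  const normY = uv.y / (filterArea.height / gameArea.height);
  // Scale the normalized [0, 1] coordinate up to sprite pixels
  return { x: normX * spriteSize.width, y: normY * spriteSize.height };
}

// With a 400x300 game and a 200x150 filter area over a 200x150 sprite,
// a vTextureCoord of (0.5, 0.5) maps to the sprite's far corner:
const p = uvToSpritePixel(
  { x: 0.5, y: 0.5 },
  { width: 200, height: 150 },
  { width: 400, height: 300 },
  { width: 200, height: 150 }
);
// p is { x: 200, y: 150 }
```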

I don't know Phaser well enough to know if there's a quick fix for any of this. Maybe you can add some uniforms to the filter that would give the shader the extra transform it needs, if you can figure out exactly where that transform comes from. Or perhaps there's a way to attach a shader directly to the sprite itself (sprites expose a shader field, null by default), so you could run your GLSL code there instead of in the filter. I hope this answer has at least explained the "why" behind your two questions above.
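For the extra-uniforms idea, one possible shape of the shader side (a sketch only — uUVScale and uSpriteSize are hypothetical uniforms you would have to declare on the filter and populate from the JavaScript side yourself, once you've worked out the intermediate transform):

```glsl
precision mediump float;

varying vec2 vTextureCoord;
uniform sampler2D uSampler;
// Hypothetical uniforms, set from Phaser each frame:
uniform vec2 uUVScale;    // filter-area / game-area ratio (0.5, 0.5 here)
uniform vec2 uSpriteSize; // sprite size in pixels

void main() {
    // Sprite-local pixel position: [0, 0] at one corner,
    // [sprite.w, sprite.h] at the opposite one, as the question asks for.
    vec2 localPixel = (vTextureCoord / uUVScale) * uSpriteSize;
    // Use localPixel for position-dependent effects; pass the texture through here.
    gl_FragColor = texture2D(uSampler, vTextureCoord);
}
```

This still wouldn't compensate for the shift you see when the sprite moves past the screen edge; that offset would need its own uniform once you find where the intermediate transform is computed.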

Upvotes: 2
