I am currently working on a shader in THREE.js that acts like the built-in normal shader, but takes a color as input to define how the object is shaded.
Below is the code that is causing the problem, followed by a longer explanation of why I am doing this and what I think is causing the problem:
fragmentShader: [
    "uniform float opacity;",
    "varying vec3 vNormal;",
    "uniform vec3 color;",
    "void main() {",
    // THIS IS WHAT I THINK IS SCREWING UP EVERYTHING:
    // can I not call normalize on an expression this complex?
    "gl_FragColor = vec4( normalize(color.r + (vNormal.x*color.r)*.5, color.g + (vNormal.y*color.g)*.5, color.b + (vNormal.z*color.b)*.5), opacity );",
    "}"
].join("\n")
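(While digging around I found that the GLSL built-in is declared as genType normalize(genType x), i.e. it only accepts a single vector argument, not a comma-separated list of floats. A minimal illustration of the accepted form:)

// normalize() takes exactly one vector and returns it scaled to unit length:
vec3 n = normalize( vec3(0.5, 1.0, 0.25) );  // OK: one vec3 argument
// normalize( 0.5, 1.0, 0.25 )               // no such overload exists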
A longer description:
I am basically using the same code as the normal shader in THREE.ShaderLib, which is as follows:
'normal': {
    uniforms: {
        "opacity" : { type: "f", value: 1.0 }
    },
    vertexShader: [
        "varying vec3 vNormal;",
        "void main() {",
        "vec4 mvPosition = modelViewMatrix * vec4( position, 1.0 );",
        "vNormal = normalize( normalMatrix * normal );",
        "gl_Position = projectionMatrix * mvPosition;",
        "}"
    ].join("\n"),
    fragmentShader: [
        "uniform float opacity;",
        "varying vec3 vNormal;",
        "void main() {",
        "gl_FragColor = vec4( 0.5 * normalize( vNormal ) + 0.5, opacity );",
        "}"
    ].join("\n")
},
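(As I understand it, the 0.5 * normalize( vNormal ) + 0.5 in that fragment shader remaps each component of the unit normal from the [-1, 1] range into the [0, 1] range that colors use, so a component of -1 becomes 0.0, 0 becomes 0.5, and +1 becomes 1.0.)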
What I am using is basically this, but with a color aspect added, and it is created within a function that defines a new shader, like so:
function assignNewShader(color){
    var colorAssign = new THREE.Color(color);
    THREE.ShaderLib[color] = {
        uniforms: {
            "opacity" : { type: "f", value: 1.0 },
            "color" : { type: "c", value: colorAssign }
        },
        vertexShader: [
            "varying vec3 vNormal;",
            "void main() {",
            "vec4 mvPosition = modelViewMatrix * vec4( position, 1.0 );",
            "vNormal = normalMatrix * normal;",
            "gl_Position = projectionMatrix * mvPosition;",
            "}"
        ].join("\n"),
        fragmentShader: [
            "uniform float opacity;",
            "varying vec3 vNormal;",
            "uniform vec3 color;",
            "void main() {",
            "gl_FragColor = vec4(color.r + (vNormal.x*color.r)*.5, color.g + (vNormal.y*color.g)*.5, color.b + (vNormal.z*color.b)*.5, opacity );",
            "}"
        ].join("\n")
    };
}
You can see that the biggest difference lies within the 'fragmentShader' section, where the vNormal is used to make the gl_FragColor similar (but not identical) to the color that is passed to the function.
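For context, since the call site isn't shown above: the new ShaderLib entry is turned into a material in the usual way, roughly like this (a sketch rather than my exact code; mesh is a placeholder):

var shader = THREE.ShaderLib[color];
var material = new THREE.ShaderMaterial({
    // clone the uniforms so each material gets its own opacity/color values
    uniforms: THREE.UniformsUtils.clone(shader.uniforms),
    vertexShader: shader.vertexShader,
    fragmentShader: shader.fragmentShader
});
mesh.material = material;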
My problem is this: as an object is 'scaled', the color difference gets more and more drastic, to the point where all of the colors are maxed out at full brightness. Because of that, I tried the following change to the 'fragmentShader' section of the code:
fragmentShader: [
    "uniform float opacity;",
    "varying vec3 vNormal;",
    "uniform vec3 color;",
    "void main() {",
    "gl_FragColor = vec4( normalize(color.r + (vNormal.x*color.r)*.5, color.g + (vNormal.y*color.g)*.5, color.b + (vNormal.z*color.b)*.5), opacity );",
    "}"
].join("\n")
When I do this, I am greeted with a PLETHORA of errors, including:
ERROR: 0:37: 'normalize' : no matching overloaded function found
ERROR: 0:37: 'constructor' : not enough data provided for construction
WebGL: INVALID_VALUE: attachShader: no object or object deleted
Could not initialise shader VALIDATE_STATUS: false, gl error [1281]
WebGL: INVALID_OPERATION: getUniformLocation: program not linked
I am definitely in over my head getting into the WebGL side of THREE, but it seems to me that this way of altering the fragment shader should work. Does anybody have any ideas as to why it might not?
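In case it helps narrow things down, the closest variant I could come up with that at least matches the normalize signature is below: it re-normalizes vNormal (which the original normal shader did in its vertex shader, and which my version dropped) and builds the color as a single vec3. This is just my best-guess sketch, and I'm not certain the math is what I actually want:

fragmentShader: [
    "uniform float opacity;",
    "varying vec3 vNormal;",
    "uniform vec3 color;",
    "void main() {",
    // Re-normalize the interpolated normal so scaling the object
    // cannot push its components outside [-1, 1]:
    "vec3 n = normalize( vNormal );",
    // GLSL multiplies vec3s componentwise, so this matches the
    // per-channel math above without needing normalize() here:
    "vec3 shaded = color + n * color * 0.5;",
    "gl_FragColor = vec4( shaded, opacity );",
    "}"
].join("\n")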