Reputation: 1533
In my app I'm doing some image pixel manipulation. My code is based on THIS example.
The only change I've made is to replace the FRAGMENT_SHADER
with a grayscale one, so it looks like this:
private static final String FRAGMENT_SHADER =
        "#extension GL_OES_EGL_image_external : require\n" +
        "precision mediump float;\n" +  // highp here doesn't seem to matter
        "varying vec2 vTextureCoord;\n" +
        "uniform samplerExternalOES sTexture;\n" +
        "void main() {\n" +
        "    vec4 tc = texture2D(sTexture, vTextureCoord);\n" +
        // compute the luma once instead of three times
        "    float gray = tc.r * 0.3 + tc.g * 0.59 + tc.b * 0.11;\n" +
        // also write alpha explicitly; leaving it unset is undefined
        "    gl_FragColor = vec4(gray, gray, gray, tc.a);\n" +
        "}\n";
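For reference, the same luma computation can be checked on the CPU side. This is a minimal sketch (class and method names are my own, not from the example), using the same 0.3/0.59/0.11 weights as the shader:

```java
public class Gray {
    // Same weighted-sum grayscale conversion as the fragment shader,
    // but on 8-bit integer channels instead of normalized floats.
    static int toGray(int r, int g, int b) {
        return (int) Math.round(r * 0.3 + g * 0.59 + b * 0.11);
    }

    public static void main(String[] args) {
        // The weights sum to 1.0, so pure white maps to pure white.
        System.out.println(toGray(255, 255, 255)); // prints 255
    }
}
```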
The problem: I recorded a video on a Galaxy S7 device. After recording, I took the recorded video and read the first frame on 2 different devices (Galaxy S7 and Galaxy S3), and I found that the pixel values are completely different between those devices.
Does anyone know why this happens, and what I can do to solve this issue? My algorithm fails because of these differences.
Update:
This is an example of the differences I've got.
Galaxy S3: part of the matrix: 213, 214, 212, 214, 216, 214, 213, 212, 212, 212, 213, 214, 214, 214, 213, 213, 214, 214, 214, 214, 212, 213, 212, 213, 212, 214, 214, 212, 212, 210, 211, 210, 211, 210, 211, 211, 214, 211, 214, 213, 213, 214, 214, 216, 216, 216, 215, 215, 216, 212, 213, 213, 214, 213, 213, 212, 211, 209, 209, 207, 208, 208, 210, 211, 209, 207, 209, 210, 217, 219, 216, 209, 209, 210, 210, 210, 211, 209, 207, 205, 205, 206, 210, 210, 220, 211, 202, 210, 211, 206, 206, 209, 210, 211, 213, 219, 222, 216, 217, 217
Count of non-zero pixels: 1632816
Sum of all the pixels: 3.1834445E8
Galaxy S7: same part of the matrix as the Galaxy S3: 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164, 164
Count of non-zero pixels: 1063680
Sum of all the pixels: 1.6593408E8
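The per-frame statistics above can be computed with a sketch like the following (assuming the decoded frame is available as a grayscale int array; the class and method names are illustrative, not from the example):

```java
public class FrameStats {
    // Returns {nonZeroCount, sum} for a grayscale pixel array.
    static long[] stats(int[] pixels) {
        long nonZero = 0;
        long sum = 0;
        for (int p : pixels) {
            if (p != 0) nonZero++;
            sum += p;
        }
        return new long[] {nonZero, sum};
    }

    public static void main(String[] args) {
        // Toy frame data just to demonstrate the computation.
        long[] s = stats(new int[] {213, 214, 0, 212, 214});
        System.out.println("non-zero: " + s[0] + ", sum: " + s[1]);
        // prints: non-zero: 4, sum: 853
    }
}
```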
Update2:
I found that the image received is completely ruined, although the video itself was recorded well.
This is the good image from the Galaxy S3:
And this is the image I got from the Galaxy S7 (same frame #):
I have no idea at all what is going on here, but I know that the last image is the same on all Marshmallow devices (Galaxy S6, S7 and Huawei).
Upvotes: 6
Views: 173
Reputation: 1533
OK, after a week of hard work looking for the solution, it was found!
As I said, my code is based on the bigflake example, and in that example there is an option to invert the frame, which is what was being done.
By changing the invert flag to false, the problem was fixed.
I'd really appreciate it if someone could explain why the frame is inverted in the first place, and why it was decided to set it to "true" by default.
Thanks for all your help!
This is the change I've made, in case it wasn't clear enough:
outputSurface.drawImage(false);
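For context, my guess (not confirmed by the example's author) as to why the invert exists: OpenGL's image origin is the bottom-left corner, while Android bitmaps put row 0 at the top, so frames read back from GL come out vertically flipped unless the draw pass inverts the Y texture coordinate. A minimal sketch of what such an invert does to the V coordinates of a textured quad (names are illustrative):

```java
import java.util.Arrays;

public class FlipTexCoords {
    // Flip the V component of interleaved (u, v) texture coordinates,
    // as a draw-time "invert" pass would.
    static float[] flipV(float[] uv) {
        float[] out = uv.clone();
        for (int i = 1; i < out.length; i += 2) {
            out[i] = 1.0f - out[i];
        }
        return out;
    }

    public static void main(String[] args) {
        // Full-quad coordinates: bottom-left, bottom-right, top-left, top-right.
        float[] quad = {0f, 0f, 1f, 0f, 0f, 1f, 1f, 1f};
        System.out.println(Arrays.toString(flipV(quad)));
        // prints: [0.0, 1.0, 1.0, 1.0, 0.0, 0.0, 1.0, 0.0]
    }
}
```

If a decoder or device already delivers frames in top-down order, applying this invert a second time would produce the upside-down output seen above, which may be why the behavior differs across devices.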
Upvotes: 1