Reputation: 123
I'm trying to save a frame from a WebRTC Android application. SurfaceViewRenderer.java draws YUV frames using a GLES shader. To save the drawn frame I added saveFrame() from Grafika to SurfaceViewRenderer.java, but when I call saveFrame() it doesn't work (the bitmap is empty). I thought glReadPixels() reads pixels from the current color buffer, but maybe it wasn't called in the current EGL context? How should I call glReadPixels() to save the current frame?
Here is the code I wrote.
In MainActivity I added a button like this:
button.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View v) {
        bitmap = localSurfaceViewRender.saveFrame();
        if (bitmap != null) {
            Toast.makeText(getApplicationContext(), "Saved!!", Toast.LENGTH_SHORT).show();
            imageView.setImageBitmap(bitmap);
        }
    }
});
In SurfaceViewRenderer.java I added some methods to read values from the currently bound EGL surface:
public int getSurfaceHeight() {
    if (mHeight < 0) {
        return eglBase.surfaceHeight();
    } else {
        return mHeight;
    }
}

public int getSurfaceWidth() {
    if (mWidth < 0) {
        return eglBase.surfaceWidth();
    } else {
        return mWidth;
    }
}
public Bitmap saveFrame() {
    int width = eglBase.surfaceWidth();
    int height = eglBase.surfaceHeight();
    ByteBuffer buf = ByteBuffer.allocateDirect(width * height * 4);
    buf.order(ByteOrder.LITTLE_ENDIAN);
    GLES20.glFinish();
    GLES20.glReadPixels(0, 0, width, height, GLES20.GL_RGB, GLES20.GL_UNSIGNED_BYTE, buf);
    GlUtil.checkNoGLES2Error("glReadPixels");
    buf.rewind();
    Bitmap bmp = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
    bmp.copyPixelsFromBuffer(buf);
    return bmp;
}
Edit: here is the full code of saveFrame:
// save as a file
public void saveFrame(final File file) {
    runOnRenderThread(new Runnable() {
        @Override
        public void run() {
            int width = eglBase.surfaceWidth();
            int height = eglBase.surfaceHeight();
            ByteBuffer buf = ByteBuffer.allocateDirect(width * height * 4);
            buf.order(ByteOrder.LITTLE_ENDIAN);
            GLES20.glFinish();
            GLES20.glReadPixels(0, 0, width, height, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, buf);
            GlUtil.checkNoGLES2Error("glReadPixels");
            buf.rewind();
            // flip vertically
            byte[] tmp = new byte[width * 4];
            for (int i = 0; i < height / 2; i++) {
                buf.get(tmp);
                System.arraycopy(buf.array(), buf.limit() - buf.position(),
                        buf.array(), buf.position() - width * 4, width * 4);
                System.arraycopy(tmp, 0, buf.array(), buf.limit() - buf.position(), width * 4);
            }
            buf.rewind();
            String filename = file.toString();
            BufferedOutputStream bos = null;
            try {
                bos = new BufferedOutputStream(new FileOutputStream(filename));
                Bitmap bmp = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
                bmp.copyPixelsFromBuffer(buf);
                bmp.compress(Bitmap.CompressFormat.JPEG, 100, bos);
                bmp.recycle();
            } catch (FileNotFoundException e) {
                e.printStackTrace();
            } finally {
                if (bos != null) try {
                    bos.close();
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
            Log.d(TAG, "Saved " + width + "x" + height + " frame as '" + filename + "'");
        }
    });
}
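As a side note on the flip loop: buf.array() throws UnsupportedOperationException for buffers created with ByteBuffer.allocateDirect(), because direct buffers generally have no backing array. A minimal row-swap sketch that works on any ByteBuffer (this is my own illustration using only absolute get/put, not code from SurfaceViewRenderer):

```java
import java.nio.ByteBuffer;

// Swaps rows top-to-bottom using only absolute get/put, so it also
// works on direct buffers, which have no accessible backing array.
public final class FrameFlip {
    public static void flipVertically(ByteBuffer buf, int width, int height) {
        int stride = width * 4; // RGBA: 4 bytes per pixel
        byte[] top = new byte[stride];
        byte[] bottom = new byte[stride];
        for (int row = 0; row < height / 2; row++) {
            int topPos = row * stride;
            int bottomPos = (height - 1 - row) * stride;
            // Copy both rows out...
            for (int i = 0; i < stride; i++) {
                top[i] = buf.get(topPos + i);
                bottom[i] = buf.get(bottomPos + i);
            }
            // ...then write them back swapped.
            for (int i = 0; i < stride; i++) {
                buf.put(topPos + i, bottom[i]);
                buf.put(bottomPos + i, top[i]);
            }
        }
    }
}
```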
Upvotes: 3
Views: 1381
Reputation: 52323
glReadPixels() reads the contents of a framebuffer from the current context. Once eglSwapBuffers() is called, the contents are gone.
Put another way, apps can't read anything out of a SurfaceView. So you have to read the pixels out just before you submit the buffer to the compositor.
(I'm not actually sure what happens if you call glReadPixels() right after swapping buffers... it's possible you'll read a previously-sent frame from a recycled buffer, but you can't always rely on that.)
You appear to be calling glReadPixels() from the main UI thread. If the SurfaceView rendering is occurring on a different thread, you need to access the framebuffer from that same thread, because that's where the EGL context is current. A given context can't be current in more than one thread at a time.
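To make the thread hand-off concrete, here is a minimal plain-Java sketch of the pattern (a single-thread executor stands in for the render thread that owns the EGL context, and capture() is a placeholder for the actual glReadPixels() read-back; neither is a WebRTC API): the caller posts the capture to the render thread and blocks on a Future until the pixels arrive.

```java
import java.nio.ByteBuffer;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// The EGL context is current on exactly one thread, so the read-back
// must run there; the caller waits for the result via a Future.
public final class CaptureDemo {
    // Stand-in for the render thread that owns the EGL context.
    private static final ExecutorService renderThread =
            Executors.newSingleThreadExecutor(r -> {
                Thread t = new Thread(r, "render");
                t.setDaemon(true); // don't keep this demo's JVM alive
                return t;
            });

    // Placeholder for the real glReadPixels() call.
    private static ByteBuffer capture(int width, int height) {
        return ByteBuffer.allocateDirect(width * height * 4); // RGBA
    }

    public static ByteBuffer saveFrame(int width, int height) {
        // Post the capture to the thread where the context is current,
        // then block until the pixels are available.
        Future<ByteBuffer> pixels =
                renderThread.submit(() -> capture(width, height));
        try {
            return pixels.get();
        } catch (InterruptedException | ExecutionException e) {
            throw new RuntimeException(e);
        }
    }
}
```

This mirrors what the asker's runOnRenderThread() edit does, with the added step of returning the buffer to the calling thread instead of writing the file on the render thread.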
Upvotes: 2