Reputation: 2066
I am developing a WebRTC video-call Android app, and it is working more than fine. I need to record the video of the other peer (remoteVideoStream) and my own stream (localVideoStream), and convert it to some saveable format like mp4 (or any other format). I really searched for how to do this, but I could not figure out how to do the job.
I have read about VideoFileRenderer and tried to add it to my code to save the video, but I could not use it: it has no method called, for example, record() or save(), although it does have a release() method, which is used to finish saving the video. Here is the class, in case anyone has any idea:
@JNINamespace("webrtc::jni")
public class VideoFileRenderer implements Callbacks, VideoSink {
    private static final String TAG = "VideoFileRenderer";
    private final HandlerThread renderThread;
    private final Handler renderThreadHandler;
    private final FileOutputStream videoOutFile;
    private final String outputFileName;
    private final int outputFileWidth;
    private final int outputFileHeight;
    private final int outputFrameSize;
    private final ByteBuffer outputFrameBuffer;
    private EglBase eglBase;
    private YuvConverter yuvConverter;
    private final ArrayList<ByteBuffer> rawFrames = new ArrayList<>();

    public VideoFileRenderer(String outputFile, int outputFileWidth, int outputFileHeight,
            final Context sharedContext) throws IOException {
        if (outputFileWidth % 2 == 1 || outputFileHeight % 2 == 1) {
            throw new IllegalArgumentException("Does not support uneven width or height");
        }
        this.outputFileName = outputFile;
        this.outputFileWidth = outputFileWidth;
        this.outputFileHeight = outputFileHeight;
        // I420 stores 1.5 bytes per pixel (full-size Y plane plus quarter-size U and V planes).
        this.outputFrameSize = outputFileWidth * outputFileHeight * 3 / 2;
        this.outputFrameBuffer = ByteBuffer.allocateDirect(this.outputFrameSize);
        this.videoOutFile = new FileOutputStream(outputFile);
        // Note: this writes a YUV4MPEG2 (.y4m) header — the output is raw YUV, not mp4.
        this.videoOutFile.write(("YUV4MPEG2 C420 W" + outputFileWidth + " H" + outputFileHeight
                + " Ip F30:1 A1:1\n").getBytes(Charset.forName("US-ASCII")));
        this.renderThread = new HandlerThread("VideoFileRenderer");
        this.renderThread.start();
        this.renderThreadHandler = new Handler(this.renderThread.getLooper());
        ThreadUtils.invokeAtFrontUninterruptibly(this.renderThreadHandler, new Runnable() {
            public void run() {
                VideoFileRenderer.this.eglBase = EglBase.create(sharedContext, EglBase.CONFIG_PIXEL_BUFFER);
                VideoFileRenderer.this.eglBase.createDummyPbufferSurface();
                VideoFileRenderer.this.eglBase.makeCurrent();
                VideoFileRenderer.this.yuvConverter = new YuvConverter();
            }
        });
    }

    public void renderFrame(I420Frame i420Frame) {
        VideoFrame frame = i420Frame.toVideoFrame();
        this.onFrame(frame);
        frame.release();
    }

    public void onFrame(VideoFrame frame) {
        frame.retain();
        this.renderThreadHandler.post(() -> this.renderFrameOnRenderThread(frame));
    }

    private void renderFrameOnRenderThread(VideoFrame frame) {
        Buffer buffer = frame.getBuffer();
        // Swap target dimensions when the frame is rotated 90 or 270 degrees.
        int targetWidth = frame.getRotation() % 180 == 0 ? this.outputFileWidth : this.outputFileHeight;
        int targetHeight = frame.getRotation() % 180 == 0 ? this.outputFileHeight : this.outputFileWidth;
        float frameAspectRatio = (float) buffer.getWidth() / (float) buffer.getHeight();
        float fileAspectRatio = (float) targetWidth / (float) targetHeight;
        // Center-crop the frame to the output aspect ratio.
        int cropWidth = buffer.getWidth();
        int cropHeight = buffer.getHeight();
        if (fileAspectRatio > frameAspectRatio) {
            cropHeight = (int) ((float) cropHeight * (frameAspectRatio / fileAspectRatio));
        } else {
            cropWidth = (int) ((float) cropWidth * (fileAspectRatio / frameAspectRatio));
        }
        int cropX = (buffer.getWidth() - cropWidth) / 2;
        int cropY = (buffer.getHeight() - cropHeight) / 2;
        Buffer scaledBuffer = buffer.cropAndScale(cropX, cropY, cropWidth, cropHeight, targetWidth, targetHeight);
        frame.release();
        I420Buffer i420 = scaledBuffer.toI420();
        scaledBuffer.release();
        // Frames are buffered in native memory and only written to disk in release().
        ByteBuffer byteBuffer = JniCommon.nativeAllocateByteBuffer(this.outputFrameSize);
        YuvHelper.I420Rotate(i420.getDataY(), i420.getStrideY(), i420.getDataU(), i420.getStrideU(),
                i420.getDataV(), i420.getStrideV(), byteBuffer, i420.getWidth(), i420.getHeight(),
                frame.getRotation());
        i420.release();
        byteBuffer.rewind();
        this.rawFrames.add(byteBuffer);
    }

    public void release() {
        CountDownLatch cleanupBarrier = new CountDownLatch(1);
        this.renderThreadHandler.post(() -> {
            this.yuvConverter.release();
            this.eglBase.release();
            this.renderThread.quit();
            cleanupBarrier.countDown();
        });
        ThreadUtils.awaitUninterruptibly(cleanupBarrier);
        try {
            for (ByteBuffer buffer : this.rawFrames) {
                this.videoOutFile.write("FRAME\n".getBytes(Charset.forName("US-ASCII")));
                byte[] data = new byte[this.outputFrameSize];
                buffer.get(data);
                this.videoOutFile.write(data);
                JniCommon.nativeFreeByteBuffer(buffer);
            }
            this.videoOutFile.close();
            Logging.d(TAG, "Video written to disk as " + this.outputFileName + ". Number frames are "
                    + this.rawFrames.size() + " and the dimension of the frames are "
                    + this.outputFileWidth + "x" + this.outputFileHeight + ".");
        } catch (IOException e) {
            Logging.e(TAG, "Error writing video to disk", e);
        }
    }
}
I can't find any method in it that helps.
Upvotes: 7
Views: 5255
Reputation: 2066
To be able to record the video I had to do as @Onix said, but fortunately I found several implementations. Here is the one I picked: https://chromium.googlesource.com/external/webrtc/+/master/sdk/android/api/org/webrtc/VideoFileRenderer.java
You can find another implementation here: https://chromium.googlesource.com/external/webrtc/+/f33970b15e0eeb46548fa602f6d0c1fcfd44dd19/webrtc/api/android/java/src/org/webrtc/VideoFileRenderer.java but it does not work with the updated version of WebRTC, so I picked the first one.
All that is left is to create an instance of that new VideoFileRenderer class (the implementation of the VideoSink I am attaching) once the stream is ready and working, and when I want to stop recording I just call its release() method.
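A minimal sketch of those two steps, assuming your call setup already has a connected remoteVideoTrack (org.webrtc.VideoTrack) and an initialized eglBase (org.webrtc.EglBase); the output path and the startRecording/stopRecording method names are illustrative, not part of the WebRTC API:

```java
// Sketch: attaching the upstream VideoFileRenderer to a remote track.
VideoFileRenderer videoFileRenderer;

void startRecording(VideoTrack remoteVideoTrack, EglBase eglBase, File outDir) throws IOException {
    // Output dimensions must be even, or the constructor throws.
    videoFileRenderer = new VideoFileRenderer(
            new File(outDir, "remote.y4m").getAbsolutePath(),
            480, 640,
            eglBase.getEglBaseContext());
    // VideoFileRenderer implements VideoSink, so it can be added like any other renderer.
    remoteVideoTrack.addSink(videoFileRenderer);
}

void stopRecording(VideoTrack remoteVideoTrack) {
    remoteVideoTrack.removeSink(videoFileRenderer);
    // release() flushes the buffered frames to disk and tears down the render thread.
    videoFileRenderer.release();
    videoFileRenderer = null;
}
```

Note that the upstream class writes a raw .y4m file, so the result still needs transcoding if you want an mp4.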
Upvotes: 0
Reputation: 692
The VideoFileRenderer class just demonstrates how you can access the decoded raw video frames of the remote/local peer.
It does not record a valid video file.
You should manually implement the logic of encoding and muxing the raw video frames into a container, like mp4.
The main flow looks like this:
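The encode-and-mux flow this answer describes can be sketched with Android's MediaCodec and MediaMuxer. This is a simplified, synchronous sketch, not a production recorder: the bitrate, frame rate, timestamp handling, and the assumption that the encoder accepts flexible YUV420 input all need to be adapted to the actual device, and end-of-stream draining is omitted for brevity.

```java
// Sketch of the flow: raw I420 frames in -> H.264 encoder -> mp4 container.
class Mp4Recorder {
    private final MediaCodec encoder;
    private final MediaMuxer muxer;
    private int trackIndex = -1;
    private boolean muxerStarted = false;

    Mp4Recorder(String outputPath, int width, int height) throws IOException {
        MediaFormat format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, width, height);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Flexible);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 2_000_000);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
        encoder = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        encoder.start();
        muxer = new MediaMuxer(outputPath, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
    }

    // Called once per raw frame, e.g. from a VideoSink's onFrame after
    // converting the VideoFrame to a packed I420 byte array.
    void encodeFrame(byte[] i420, long presentationTimeUs) {
        int inIndex = encoder.dequeueInputBuffer(10_000);
        if (inIndex >= 0) {
            ByteBuffer in = encoder.getInputBuffer(inIndex);
            in.clear();
            in.put(i420);
            encoder.queueInputBuffer(inIndex, 0, i420.length, presentationTimeUs, 0);
        }
        drainEncoder();
    }

    private void drainEncoder() {
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        int outIndex;
        while ((outIndex = encoder.dequeueOutputBuffer(info, 0)) >= 0
                || outIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
            if (outIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                // The muxer track must be created from the encoder's *output* format.
                trackIndex = muxer.addTrack(encoder.getOutputFormat());
                muxer.start();
                muxerStarted = true;
            } else {
                ByteBuffer encoded = encoder.getOutputBuffer(outIndex);
                if (muxerStarted && info.size > 0) {
                    muxer.writeSampleData(trackIndex, encoded, info);
                }
                encoder.releaseOutputBuffer(outIndex, false);
            }
        }
    }

    void stop() {
        encoder.stop();
        encoder.release();
        if (muxerStarted) {
            muxer.stop();
        }
        muxer.release();
    }
}
```

For audio you would run a second MediaCodec (e.g. AAC) and add a second track to the same muxer before calling start().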
Upvotes: 6