Tunji_D

Reputation: 3687

Android Camera2 pipeline: How do I encode h.264 units using MediaCodec from an input Surface?

I have an Android application using the Camera2 API. The ultimate goal is to get h.264 units to write to a stream. So far I have:

  1. Successfully created a capture session and can write to preview, local recording and streaming surfaces via:
    session.device.createCaptureRequest(CameraDevice.TEMPLATE_RECORD).run {
        addTarget(previewSurface)
        addTarget(recorder.surface)
        addTarget(streamer.surface)
        set(CaptureRequest.CONTROL_AE_TARGET_FPS_RANGE, Range(args.fps, args.fps))
        build()
    }
  2. Set up a MediaCodec to encode data from the streamer.surface param above, where the surface comes from a call to mediaCodec.createInputSurface() on the MediaCodec created as follows:
    internal fun streamingCodec(args: CameraFragmentArgs): MediaCodec {
        val mediaFormat = MediaFormat.createVideoFormat("video/avc", args.width, args.height).apply {
            setInteger(MediaFormat.KEY_BIT_RATE, 2000 * 1024)
            setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 2)
            setInteger(MediaFormat.KEY_FRAME_RATE, args.fps)
            setInteger(
                MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface
            )
        }

        val encoderName = MediaCodecList(MediaCodecList.REGULAR_CODECS).findEncoderForFormat(mediaFormat)

        return MediaCodec.createByCodecName(encoderName).apply {
            configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE)
        }
    }
  3. Used the asynchronous callback for when the surface above has data in its ByteBuffer (a sketch of how everything is wired together follows this snippet):
    private class StreamingCallback : MediaCodec.Callback() {

        override fun onInputBufferAvailable(codec: MediaCodec, index: Int) = Unit

        override fun onOutputBufferAvailable(codec: MediaCodec, index: Int, info: BufferInfo) {
            val byteBuffer = codec.getOutputBuffer(index)
            // Is the data in the buffer properly encoded as h.264 here? Did I need to use MediaExtractor?
            codec.releaseOutputBuffer(index, false) // buffers must be released back to the codec
        }

        override fun onOutputFormatChanged(codec: MediaCodec, format: MediaFormat) = Unit

        override fun onError(codec: MediaCodec, e: MediaCodec.CodecException) {
            Log.e("TEST", "onError in codec", e)
        }
    }
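
For reference, here is how I believe the pieces are meant to be wired together (a sketch reusing the names from the snippets above; per the MediaCodec docs, setCallback must precede configure in asynchronous mode, and createInputSurface must come after configure but before start):

    // mediaFormat and encoderName are built exactly as in streamingCodec above
    val codec = MediaCodec.createByCodecName(encoderName).apply {
        setCallback(StreamingCallback()) // async callback: must be set before configure()
        configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE)
    }
    val streamerSurface = codec.createInputSurface() // after configure(), before start()
    codec.start()
    // streamerSurface is the streamer.surface target added to the capture request in step 1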

My confusion is, at this point, are the values in the ByteBuffer properly encoded? Do I need to use MediaExtractor to clean up the data coming in from the input Surface before I pass it to the MediaCodec to encode? The pipeline seems clean enough, but I'm not quite sure what is and isn't necessary. This document has been my biggest guide, and it mentions that MediaCodec operates on raw data, which makes me think I need a MediaExtractor, but MediaExtractor doesn't take a Surface as input, which makes the right order of items in the pipeline all the more confusing.

Upvotes: 2

Views: 1832

Answers (1)

Eddy Talvala

Reputation: 18117

You don't need MediaExtractor - that's for processing a complete container file and splitting out its various streams and other components.

The MediaCodec receives the raw image buffers from the camera directly, and will output encoded buffers. If you want to save a standard video file, you'll need to feed those encoded ByteBuffers into a MediaMuxer instance. If you're just sending the encoded buffers elsewhere for decode (like for a video chat application), you can just feed the ByteBuffers to a MediaCodec at your destination.
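
As a sketch, the save-to-file path with the question's asynchronous callback could look roughly like this (the outputPath parameter and the muxer lifecycle handling here are illustrative, not the only way to structure it):

    import android.media.MediaCodec
    import android.media.MediaFormat
    import android.media.MediaMuxer
    import android.util.Log

    private class StreamingCallback(outputPath: String) : MediaCodec.Callback() {

        private val muxer = MediaMuxer(outputPath, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4)
        private var trackIndex = -1

        override fun onInputBufferAvailable(codec: MediaCodec, index: Int) = Unit

        override fun onOutputFormatChanged(codec: MediaCodec, format: MediaFormat) {
            // Fires before the first encoded buffer; this format carries the
            // codec-specific data (SPS/PPS) the muxer needs for the video track.
            trackIndex = muxer.addTrack(format)
            muxer.start()
        }

        override fun onOutputBufferAvailable(codec: MediaCodec, index: Int, info: MediaCodec.BufferInfo) {
            val buffer = codec.getOutputBuffer(index) ?: return
            val isConfig = info.flags and MediaCodec.BUFFER_FLAG_CODEC_CONFIG != 0
            if (!isConfig && info.size > 0) {
                // The buffer already holds encoded h.264; hand it straight to the muxer
                muxer.writeSampleData(trackIndex, buffer, info)
            }
            codec.releaseOutputBuffer(index, false)
            if (info.flags and MediaCodec.BUFFER_FLAG_END_OF_STREAM != 0) {
                muxer.stop()
                muxer.release()
            }
        }

        override fun onError(codec: MediaCodec, e: MediaCodec.CodecException) {
            Log.e("TEST", "onError in codec", e)
        }
    }

For a streaming use case, you'd replace the MediaMuxer calls with whatever packetizes each ByteBuffer for your transport; the buffers themselves are already encoded.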

I can't speak to whether all your parameters to MediaCodec are correct, but I don't see anything obviously wrong.

Upvotes: 4
