Reputation: 1322
I'm creating Bitmaps and I want them to be (hardware-)encoded and muxed into .mp4 file.
I'm using MediaCodec class to encode and MediaMuxer to mux.
The problem is that the encoder (MediaCodec) gives me input buffers that are too small. No matter which color format I choose (in the MediaFormat object passed to the encoder), it always returns buffers of size resWidth * resHeight * 1.5,
which looks like it expects 12-bit color coding.
For example, when I choose 960x540-pixel frames, the encoder passes me buffers 777600 bytes long. When I choose another resolution, it scales the buffer size accordingly.
Bitmaps are created during program execution and I render Android Views onto them (using Canvas). The Bitmap class specification doesn't give me many choices of color format, and there is no 12-bit option as far as I know.
I can, however, choose 8-bit, and this is what I get when I fill the input buffers with the content of my 8-bit-per-pixel bitmaps: https://www.youtube.com/watch?v=5c-fjYp9KMQ
As you can see, everything is greenish, and it shouldn't be. Here is a minimal (two classes) working code example exposing this issue: https://github.com/eeprojects/MediaCodecExample
You can generate the video shown above just by running the application and waiting a few seconds. Everything the app does at runtime is logged to LogCat.
I tried setting the buffer size manually (which is possible via the MediaFormat.KEY_MAX_INPUT_SIZE field), set it to resWidth * resHeight * 2, and coded the bitmaps in 16-bit, but when trying to dequeue an output buffer this way, the codec reports a fatal internal error and the application crashes:
E/ACodec: [OMX.Exynos.AVC.Encoder] ERROR(0x80001001)
E/ACodec: signalError(omxError 0x80001001, internalError -2147483648)
E/MediaCodec: Codec reported err 0x80001001, actionCode 0, while in state 6
Upvotes: 0
Views: 1562
Reputation: 13317
You don't need to guess blindly what format the encoder wants - you actually choose it yourself in your application code. From your MainActivity.java:
MediaCodecInfo codecInfo = selectCodec(OUTPUT_VIDEO_MIME_TYPE);
int colorFormat = selectColorFormat(codecInfo, OUTPUT_VIDEO_MIME_TYPE);
MediaFormat outputVideoFormat =
MediaFormat.createVideoFormat(OUTPUT_VIDEO_MIME_TYPE, TEX_WIDTH, TEX_HEIGHT);
outputVideoFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, colorFormat);
The selectColorFormat method probably returned some YUV 420 color format. All the common YUV 420 color formats (e.g. MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar) are 12 bit. Or more precisely, you have an 8-bit luminance plane at full resolution, followed by two chrominance planes with 8-bit components, subsampled 2x both horizontally and vertically.
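That layout is exactly where the 1.5-bytes-per-pixel figure from the question comes from; here is a small sketch of the arithmetic (the class and method names are made up for illustration):

```java
public class Yuv420Size {
    // Size of a planar YUV 4:2:0 frame: one full-resolution 8-bit
    // luma (Y) plane, plus two chroma (U, V) planes subsampled 2x
    // both horizontally and vertically.
    static int frameSize(int width, int height) {
        int luma = width * height;               // Y plane
        int chroma = (width / 2) * (height / 2); // one of U or V
        return luma + 2 * chroma;                // = width * height * 3 / 2
    }

    public static void main(String[] args) {
        // Matches the 777600-byte buffers reported for 960x540 frames.
        System.out.println(Yuv420Size.frameSize(960, 540)); // prints 777600
    }
}
```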
When you draw this using Bitmap.Config.ALPHA_8, you only set the luminance plane, while the chrominance planes are left uninitialized, probably set to zero, which gives the greenish color. If you set the rest of the input buffer's bytes to 128 instead of 0, you'd get a grayscale image.
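A minimal sketch of that grayscale workaround, assuming a planar YUV 420 layout and using a plain byte array to stand in for the encoder's input buffer (the helper name toI420Gray is hypothetical):

```java
import java.util.Arrays;

public class GrayscaleFill {
    // Copy the 8-bit luminance pixels (e.g. the contents of a
    // Bitmap.Config.ALPHA_8 bitmap) into the Y plane and fill both
    // chrominance planes with the neutral value 128, producing a
    // grayscale frame instead of a greenish one.
    static byte[] toI420Gray(byte[] luma, int width, int height) {
        int lumaSize = width * height;
        int chromaSize = (width / 2) * (height / 2);
        byte[] frame = new byte[lumaSize + 2 * chromaSize];
        System.arraycopy(luma, 0, frame, 0, lumaSize);        // Y plane
        Arrays.fill(frame, lumaSize, frame.length, (byte) 128); // U and V planes
        return frame;
    }
}
```

In the real code the result would be written into the ByteBuffer returned by the encoder's getInputBuffer() before queueing it.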
Since Bitmap doesn't support YUV pixel formats, you either need to convert the pixel data manually, or use the Surface input method available since Android 4.3. Then you can use anything that can draw into a Surface to produce the input - you can at least use OpenGL ES, not sure about Canvas though.
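A rough sketch of the Surface input setup (an Android-only fragment, assuming API 18+; bit rate, frame rate and key-frame interval values are placeholders):

```java
MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
// With Surface input the color format is COLOR_FormatSurface, not a YUV one.
format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
        MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
format.setInteger(MediaFormat.KEY_BIT_RATE, 2000000);
format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
// Must be called after configure() and before start().
Surface inputSurface = encoder.createInputSurface();
encoder.start();
// Render frames into inputSurface (e.g. with OpenGL ES via EGL),
// then drain the encoder's output buffers into MediaMuxer as before.
```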
Upvotes: 1