user2118853

How to change orientation of captured byte[] frames through onPreviewFrame callback?

I have searched for this a lot but have never seen any satisfactory answers, so now I have a last hope here.

I have an onPreviewFrame callback set up, which gives a byte[] of raw frames in the supported preview format (NV21, with H.264 encoded type).

Now, the problem is that the callback always delivers byte[] frames in a fixed orientation; when the device rotates, the rotation is not reflected in the captured byte[] frames. I have tried setDisplayOrientation and setRotation, but these APIs only affect the preview being displayed, not the captured byte[] frames at all.
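
Roughly what I tried (simplified; 90 is just an example angle):

Camera camera = Camera.open();

// Rotates only the on-screen preview, not the bytes passed to onPreviewFrame
camera.setDisplayOrientation(90);

// Affects only the JPEGs returned by takePicture(), not preview callback buffers
Camera.Parameters params = camera.getParameters();
params.setRotation(90);
camera.setParameters(params);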

The Android docs even say that Camera.setDisplayOrientation only affects the displayed preview, not the frame bytes:

This does not affect the order of byte array passed in onPreviewFrame(byte[], Camera), JPEG pictures, or recorded videos.

Finally: is there a way, at any API level, to change the orientation of the byte[] frames?

Upvotes: 9

Views: 7189

Answers (2)

Sheraz Nadeem

Reputation: 311

I have modified the onPreviewFrame method of this open-source Android Touch-To-Record library to transpose and resize a captured frame.

I defined "yuvIplImage" as following in my setCameraParams() method.

IplImage yuvIplImage = IplImage.create(mPreviewSize.height, mPreviewSize.width, opencv_core.IPL_DEPTH_8U, 2);

This is my onPreviewFrame() method:

@Override
public void onPreviewFrame(byte[] data, Camera camera)
{

    long frameTimeStamp = 0L;

    // Derive the frame timestamp: use wall-clock time until audio samples
    // arrive, then follow the audio timestamps to keep A/V in sync
    if(FragmentCamera.mAudioTimestamp == 0L && FragmentCamera.firstTime > 0L)
    {
        frameTimeStamp = 1000L * (System.currentTimeMillis() - FragmentCamera.firstTime);
    }
    else if(FragmentCamera.mLastAudioTimestamp == FragmentCamera.mAudioTimestamp)
    {
        frameTimeStamp = FragmentCamera.mAudioTimestamp + FragmentCamera.frameTime;
    }
    else
    {
        long l2 = (System.nanoTime() - FragmentCamera.mAudioTimeRecorded) / 1000L;
        frameTimeStamp = l2 + FragmentCamera.mAudioTimestamp;
        FragmentCamera.mLastAudioTimestamp = FragmentCamera.mAudioTimestamp;
    }

    synchronized(FragmentCamera.mVideoRecordLock)
    {
        if(FragmentCamera.recording && FragmentCamera.rec && lastSavedframe != null && lastSavedframe.getFrameBytesData() != null && yuvIplImage != null)
        {
            FragmentCamera.mVideoTimestamp += FragmentCamera.frameTime;

            if(lastSavedframe.getTimeStamp() > FragmentCamera.mVideoTimestamp)
            {
                FragmentCamera.mVideoTimestamp = lastSavedframe.getTimeStamp();
            }

            try
            {
                yuvIplImage.getByteBuffer().put(lastSavedframe.getFrameBytesData());

                // Working images: a full-size 4-channel frame, its transposed
                // (rotated) copy, and a square crop for the recorder
                IplImage bgrImage = IplImage.create(mPreviewSize.width, mPreviewSize.height, opencv_core.IPL_DEPTH_8U, 4);// In my case, mPreviewSize.width = 1280 and mPreviewSize.height = 720
                IplImage transposed = IplImage.create(mPreviewSize.height, mPreviewSize.width, yuvIplImage.depth(), 4);
                IplImage squared = IplImage.create(mPreviewSize.height, mPreviewSize.height, yuvIplImage.depth(), 4);

                int[] _temp = new int[mPreviewSize.width * mPreviewSize.height];

                // Convert the NV21 preview bytes into packed 32-bit pixels
                Util.YUV_NV21_TO_BGR(_temp, data, mPreviewSize.width, mPreviewSize.height);

                bgrImage.getIntBuffer().put(_temp);

                // Rotate 90 degrees clockwise: transpose, then flip around the vertical axis
                opencv_core.cvTranspose(bgrImage, transposed);
                opencv_core.cvFlip(transposed, transposed, 1);

                // Crop the rotated frame to a square (height x height) via a region of interest
                opencv_core.cvSetImageROI(transposed, opencv_core.cvRect(0, 0, mPreviewSize.height, mPreviewSize.height));
                opencv_core.cvCopy(transposed, squared, null);
                opencv_core.cvResetImageROI(transposed);

                videoRecorder.setTimestamp(lastSavedframe.getTimeStamp());
                videoRecorder.record(squared);
            }
            catch(com.googlecode.javacv.FrameRecorder.Exception e)
            {
                e.printStackTrace();
            }
        }

        lastSavedframe = new SavedFrames(data, frameTimeStamp);
    }
}

This code uses a method "YUV_NV21_TO_BGR", which I found via this link.

Basically, this method resolves what I call "the Green Devil problem on Android". You can see other Android devs facing the same problem in other SO threads. Before I added the "YUV_NV21_TO_BGR" method, just taking the transpose of yuvIplImage (and, more importantly, any combination of transpose and flip, with or without resizing) produced a greenish output in the resulting video. The "YUV_NV21_TO_BGR" method saved the day. Thanks to @David Han from the above Google Groups thread.
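
For reference, the standard NV21-to-RGB conversion looks roughly like the sketch below. This is the widely circulated decodeYUV420SP routine (a minimal sketch, not necessarily the exact method from that thread); the fixed-point constants implement the usual YUV-to-RGB matrix:

// Minimal sketch: convert an NV21 (YUV420SP) frame into packed ARGB pixels
static void decodeYUV420SP(int[] argb, byte[] yuv, int width, int height)
{
    final int frameSize = width * height;

    for(int j = 0, yp = 0; j < height; j++)
    {
        int uvp = frameSize + (j >> 1) * width, u = 0, v = 0;

        for(int i = 0; i < width; i++, yp++)
        {
            int y = (0xFF & yuv[yp]) - 16;
            if(y < 0) y = 0;

            if((i & 1) == 0) // one V/U pair is shared by two horizontal pixels
            {
                v = (0xFF & yuv[uvp++]) - 128;
                u = (0xFF & yuv[uvp++]) - 128;
            }

            int y1192 = 1192 * y;
            int r = y1192 + 1634 * v;
            int g = y1192 - 833 * v - 400 * u;
            int b = y1192 + 2066 * u;

            // Clamp to the 18-bit fixed-point range before packing
            r = Math.max(0, Math.min(r, 262143));
            g = Math.max(0, Math.min(g, 262143));
            b = Math.max(0, Math.min(b, 262143));

            argb[yp] = 0xFF000000 | ((r << 6) & 0xFF0000) | ((g >> 2) & 0xFF00) | ((b >> 10) & 0xFF);
        }
    }
}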

Also, you should know that all this processing (transpose, flip and resize) inside onPreviewFrame takes a lot of time, which causes a very serious hit on your frames-per-second (FPS) rate. When I used this code inside onPreviewFrame, the FPS of the recorded video dropped from 30 fps to 3 fps.

I would advise against this approach. Instead, do the processing (transpose, flip and resize) on the recorded video file afterwards, using JavaCV in an AsyncTask. Hope this helps.
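
A minimal sketch of that idea, assuming a hypothetical rotateVideoFile() helper that wraps the JavaCV frame-by-frame work (the helper name and signature are placeholders, not library API):

private class RotateVideoTask extends AsyncTask<Void, Void, Boolean>
{
    private final String inputPath;
    private final String outputPath;

    RotateVideoTask(String inputPath, String outputPath)
    {
        this.inputPath = inputPath;
        this.outputPath = outputPath;
    }

    @Override
    protected Boolean doInBackground(Void... params)
    {
        try
        {
            // Hypothetical helper: grab each frame with JavaCV, transpose/flip
            // (and resize) it, and record it to the output file
            rotateVideoFile(inputPath, outputPath);
            return true;
        }
        catch(Exception e)
        {
            e.printStackTrace();
            return false;
        }
    }

    @Override
    protected void onPostExecute(Boolean success)
    {
        // Update the UI here, e.g. play the rotated video or show an error
    }
}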

Upvotes: 1

El Bert

Reputation: 3026

One possible way, if you don't care about the format, is to use the YuvImage class to get a JPEG buffer, use this buffer to create a Bitmap, and rotate it by the corresponding angle. Something like this:

@Override
public void onPreviewFrame(byte[] data, Camera camera) {

    Size previewSize = camera.getParameters().getPreviewSize();
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    byte[] rawImage = null;

    // Compress the raw NV21 preview buffer to JPEG
    // (YOUR_JPEG_COMPRESSION is a placeholder for a 0-100 quality value)
    YuvImage yuv = new YuvImage(data, ImageFormat.NV21, previewSize.width, previewSize.height, null);
    yuv.compressToJpeg(new Rect(0, 0, previewSize.width, previewSize.height), YOUR_JPEG_COMPRESSION, baos);
    rawImage = baos.toByteArray();

    // This is the same image as the preview but in JPEG and not rotated
    Bitmap bitmap = BitmapFactory.decodeByteArray(rawImage, 0, rawImage.length);
    ByteArrayOutputStream rotatedStream = new ByteArrayOutputStream();

    // Rotate the Bitmap (YOUR_DEFAULT_ROTATION is a placeholder angle in degrees)
    Matrix matrix = new Matrix();
    matrix.postRotate(YOUR_DEFAULT_ROTATION);

    // Create a rotated copy of the Bitmap
    bitmap = Bitmap.createBitmap(bitmap, 0, 0, previewSize.width, previewSize.height, matrix, false);

    // Compress the rotated Bitmap back to JPEG
    bitmap.compress(CompressFormat.JPEG, YOUR_JPEG_COMPRESSION, rotatedStream);

    rawImage = rotatedStream.toByteArray();

    // Do something with this byte array
}
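
Note that for a 90 or 270 degree rotation the resulting Bitmap's width and height are swapped relative to the preview size, and that compressing to JPEG and decoding it back on every frame is itself expensive, so doing this inside onPreviewFrame will also cost you FPS.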

Upvotes: 2
