qq456cvb

Reputation: 167

iOS OpenCV callback "processImage" resolution does not match the ImageView

I am using OpenCV 3 on iOS, and I use the following code to capture video and process the images:

videoCamera = [[CvVideoCamera alloc] initWithParentView:_imageView];
videoCamera.defaultAVCaptureDevicePosition = AVCaptureDevicePositionBack;
videoCamera.delegate = self;
videoCamera.defaultAVCaptureSessionPreset = AVCaptureSessionPreset640x480;
videoCamera.defaultAVCaptureVideoOrientation = AVCaptureVideoOrientationPortrait;
videoCamera.defaultFPS = 30;

and this is the callback:

- (void)processImage:(cv::Mat &)image {}

But the image I get has 640 rows and 480 cols, which is strange, because if I fit the image into a 640-wide by 480-high ImageView, it fits perfectly. The mat should therefore be 480 rows by 640 cols, since OpenCV's Mat is row-major. I need to process it as a 480 * 640 mat. Any solutions?

I also tried transposing it, but then it looks strange when shown in the ImageView. Maybe OpenCV has rotated the mat implicitly somewhere internally?
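
For reference, this is roughly the transpose-plus-flip I am experimenting with. I assume a plain transpose only mirrors the frame across its diagonal (which would explain why it looked strange), so a flip afterwards should give a proper 90-degree rotation into 480 rows * 640 cols:

- (void)processImage:(cv::Mat &)image {
    // image arrives here as 640 rows x 480 cols
    cv::Mat rotated;
    cv::transpose(image, rotated);   // 480 rows x 640 cols, but mirrored
    cv::flip(rotated, rotated, 1);   // flip around the y-axis to undo the mirroring;
                                     // use flip code 0 instead if the result comes out upside down
    // ... process the 480 x 640 'rotated' mat here ...
}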

Upvotes: 1

Views: 500

Answers (1)

James Bush

Reputation: 1527

Although it looks like you fixed it, I sense you're going to have similar problems down the line. Here's how to avoid all of them (I'm just pasting code from another answer that happens to also apply to this situation):

"You weren't specific as to whether the window in question was a view or layer, or whether it was live video or saved to a file. You also didn't specify whether the video was recorded via OpenCV or if it was recorded by another means.

So, I included code snippets for every contingency; if you're familiar with OpenCV and the basics of view programming for iOS, it should be obvious what you should use (in my case, by the way, I use all of it):

- (void)viewDidLayoutSubviews {
    [super viewDidLayoutSubviews];

    switch ([UIDevice currentDevice].orientation) {
        case UIDeviceOrientationPortraitUpsideDown:
            self.videoCamera.defaultAVCaptureVideoOrientation = AVCaptureVideoOrientationPortraitUpsideDown;
            break;
        case UIDeviceOrientationLandscapeLeft:
            self.videoCamera.defaultAVCaptureVideoOrientation = AVCaptureVideoOrientationLandscapeLeft;
            break;
        case UIDeviceOrientationLandscapeRight:
            self.videoCamera.defaultAVCaptureVideoOrientation = AVCaptureVideoOrientationLandscapeRight;
            break;
        default:
            self.videoCamera.defaultAVCaptureVideoOrientation = AVCaptureVideoOrientationPortrait;
            break;
    }

    [self refresh];
}
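
Note that refresh is a helper that isn't shown here; presumably it just restarts the camera so the new defaultAVCaptureVideoOrientation takes effect on subsequent frames. A minimal sketch, assuming self.videoCamera is the CvVideoCamera instance (the helper's name and behavior are placeholders, not OpenCV API):

- (void)refresh {
    // Hypothetical helper: restart the capture so the orientation set in
    // viewDidLayoutSubviews is applied to newly delivered frames.
    if (self.videoCamera.running) {
        [self.videoCamera stop];
        [self.videoCamera start];
    }
}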

- (void)processImage:(cv::Mat &)mat {

    if (self.videoCamera.running) {
        switch (self.videoCamera.defaultAVCaptureVideoOrientation) {
            case AVCaptureVideoOrientationLandscapeLeft:
            case AVCaptureVideoOrientationLandscapeRight:
                // The landscape video is captured upside-down.
                // Rotate it by 180 degrees.
                cv::flip(mat, mat, -1);
                break;
            default:
                break;
        }
    }
}

- (void)convertBlendSrcMatToWidth:(int)dstW height:(int)dstH {

    double dstAspectRatio = dstW / (double)dstH;

    int srcW = originalBlendSrcMat.cols;
    int srcH = originalBlendSrcMat.rows;
    double srcAspectRatio = srcW / (double)srcH;
    cv::Mat subMat;
    if (srcAspectRatio < dstAspectRatio) {
        int subMatH = (int)(srcW / dstAspectRatio);
        int startRow = (srcH - subMatH) / 2;
        int endRow = startRow + subMatH;
        subMat = originalBlendSrcMat.rowRange(startRow, endRow);
    } else {
        int subMatW = (int)(srcH * dstAspectRatio);
        int startCol = (srcW - subMatW) / 2;
        int endCol = startCol + subMatW;
        subMat = originalBlendSrcMat.colRange(startCol, endCol);
    }
    cv::resize(subMat, convertedBlendSrcMat, cv::Size(dstW, dstH), 0.0, 0.0, cv::INTER_LANCZOS4);
}

- (int)imageWidth {
    AVCaptureVideoDataOutput *output = [self.captureSession.outputs lastObject];
    NSDictionary *videoSettings = [output videoSettings];
    int videoWidth = [[videoSettings objectForKey:@"Width"] intValue];
    return videoWidth;
}

- (int)imageHeight {
    AVCaptureVideoDataOutput *output = [self.captureSession.outputs lastObject];
    NSDictionary *videoSettings = [output videoSettings];
    int videoHeight = [[videoSettings objectForKey:@"Height"] intValue];
    return videoHeight;
}

- (void)updateSize {
    // Do nothing.
}

- (void)layoutPreviewLayer {
    if (self.parentView != nil) {

        // Center the video preview.
        self.customPreviewLayer.position = CGPointMake(0.5 * self.parentView.frame.size.width, 0.5 * self.parentView.frame.size.height);

        // Find the video's aspect ratio.
        CGFloat videoAspectRatio = self.imageWidth / (CGFloat)self.imageHeight;

        // Scale the video preview while maintaining its aspect ratio.
        CGFloat boundsW;
        CGFloat boundsH;
        if (self.imageHeight > self.imageWidth) {
            if (self.letterboxPreview) {
                boundsH = self.parentView.frame.size.height;
                boundsW = boundsH * videoAspectRatio;
            } else {
                boundsW = self.parentView.frame.size.width;
                boundsH = boundsW / videoAspectRatio;
            }
        } else {
            if (self.letterboxPreview) {
                boundsW = self.parentView.frame.size.width;
                boundsH = boundsW / videoAspectRatio;
            } else {
                boundsH = self.parentView.frame.size.height;
                boundsW = boundsH * videoAspectRatio;
            }
        }
        self.customPreviewLayer.bounds = CGRectMake(0.0, 0.0, boundsW, boundsH);
    }
}


There's a lot here, and you have to know where to put it. If you can't figure it out, let me know."
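
As a rough guide to placement (only a sketch; the class name is a placeholder): viewDidLayoutSubviews, processImage: and convertBlendSrcMatToWidth:height: belong in your view controller, while imageWidth, imageHeight, updateSize and layoutPreviewLayer use captureSession, parentView and customPreviewLayer, so they presumably belong in a CvVideoCamera subclass that also declares the letterboxPreview flag used above. Something like:

#import <opencv2/videoio/cap_ios.h> // CvVideoCamera (assuming the OpenCV 3 header layout)

// Hypothetical subclass: holds the overrides above plus the letterbox flag
// that layoutPreviewLayer reads.
@interface VideoCamera : CvVideoCamera

@property (nonatomic, assign) BOOL letterboxPreview;

- (int)imageWidth;
- (int)imageHeight;

@end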

Upvotes: 0
