saimonx

Reputation: 516

OpenGL + Camera View (iPhone)

I have spent two days trying to show an OpenGL ES view over the camera preview on the iPhone.

The camera preview alone works. The EAGLView (OpenGL ES) alone works.

The problem appears when I try to place the EAGLView over the camera preview.

I am able to place both UIViews at the same time, but the camera preview is always on top of the EAGLView (wrong!). When I set the camera preview's alpha to 0.5, I can see both UIViews just as I want, but both are semi-transparent (which is expected with that alpha).

I have tried [self.view bringSubviewToFront:(EAGLView)], but nothing changes.

The EAGLView is set up in Interface Builder as the view's class. The camera view is added as a subview in code.

Here is some of the code; I can post more if you need it.

Thanks!!!

EAGLView

+ (Class)layerClass {
    return [CAEAGLLayer class];
}


//The GL view is stored in the nib file. When it's unarchived it's sent -initWithCoder:
- (id)initWithCoder:(NSCoder*)coder {
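    // puntosPintar holds 4 vertices with 2 GLfloat components (x, y) each;
    // drawView renders them later as a triangle strip.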


    puntosPintar=(GLfloat*)malloc(sizeof(GLfloat)*8);
    puntosPintar[0] = -0.25f;
    puntosPintar[1] = -1.22f;
    puntosPintar[2] = -0.41f;
    puntosPintar[3] = 0.0f;
    puntosPintar[4] = 0.35f;
    puntosPintar[5] = -1.69f;
    puntosPintar[6] = 0.15f;
    puntosPintar[7] = 0.0f;

    if ((self = [super initWithCoder:coder])) {
        // Get the layer
        CAEAGLLayer *eaglLayer = (CAEAGLLayer *)self.layer;

        eaglLayer.opaque = NO;
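        // Keeping the layer non-opaque (together with the alpha-0 clear color in drawView)
        // is what lets the camera preview show through behind the GL content.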
        eaglLayer.drawableProperties = [NSDictionary dictionaryWithObjectsAndKeys: [NSNumber numberWithBool:NO], kEAGLDrawablePropertyRetainedBacking, kEAGLColorFormatRGBA8, kEAGLDrawablePropertyColorFormat, nil];

        context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES1];

        if (!context || ![EAGLContext setCurrentContext:context]) {
            [self release];
            return nil;
        }


    }
    return self;
}




- (void)drawView {
    const GLubyte squareColors[] = {
        255, 255,   0, 255,
        0,   255, 255, 255,
        0,     0,   0,   0,
        255,   0, 255, 255,
    };

    [EAGLContext setCurrentContext:context];

    glBindFramebufferOES(GL_FRAMEBUFFER_OES, viewFramebuffer);
    glViewport(0, 0, backingWidth, backingHeight);

    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrthof(-1.0f, 1.0f, -1.5f, 1.5f, -1.0f, 1.0f);


    glMatrixMode(GL_MODELVIEW);
    // Clear to fully transparent black so the camera preview can show through.
    glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    glVertexPointer(2, GL_FLOAT, 0, puntosPintar);
    glEnableClientState(GL_VERTEX_ARRAY);

    glColorPointer(4, GL_UNSIGNED_BYTE, 0, squareColors);

    glEnableClientState(GL_COLOR_ARRAY);

    // Only 4 vertices are defined (8 GLfloats, 2 per vertex), so draw 4, not 8.
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);

    glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);
    [context presentRenderbuffer:GL_RENDERBUFFER_OES];

}


- (void)layoutSubviews {
    [EAGLContext setCurrentContext:context];

    [self destroyFramebuffer];
    [self createFramebuffer];

    [self drawView];


}


- (BOOL)createFramebuffer {



    glGenFramebuffersOES(1, &viewFramebuffer);
    glGenRenderbuffersOES(1, &viewRenderbuffer);

    glBindFramebufferOES(GL_FRAMEBUFFER_OES, viewFramebuffer);
    glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);
    [context renderbufferStorage:GL_RENDERBUFFER_OES fromDrawable:(CAEAGLLayer*)self.layer];
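    // The color renderbuffer takes its storage (and size) from the view's CAEAGLLayer.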
    glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES, GL_RENDERBUFFER_OES, viewRenderbuffer);

    glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_WIDTH_OES, &backingWidth);
    glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_HEIGHT_OES, &backingHeight);
    if (USE_DEPTH_BUFFER) {
        glGenRenderbuffersOES(1, &depthRenderbuffer);
        glBindRenderbufferOES(GL_RENDERBUFFER_OES, depthRenderbuffer);
        glRenderbufferStorageOES(GL_RENDERBUFFER_OES, GL_DEPTH_COMPONENT16_OES, backingWidth, backingHeight);
        glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_DEPTH_ATTACHMENT_OES, GL_RENDERBUFFER_OES, depthRenderbuffer);
    }



    if(glCheckFramebufferStatusOES(GL_FRAMEBUFFER_OES) != GL_FRAMEBUFFER_COMPLETE_OES) {
        NSLog(@"failed to make complete framebuffer object %x", glCheckFramebufferStatusOES(GL_FRAMEBUFFER_OES));
        return NO;
    }


    return YES;
}

UIViewController where I want to show both. Loading the camera view:

    [CameraImageHelper startRunning];
    UIView *fafa;
    fafa= [[UIView alloc]initWithFrame:self.view.bounds]; //returns a UIView with the cameraview as a layer of that view. It works well (checked)
    fafa = [CameraImageHelper previewWithBounds:self.view.bounds];
    fafa.alpha=0.5;  //Only way to show both
    [self.view addSubview:fafa];
    [self.view bringSubviewToFront:fafa];

Loading the EAGLView. In the .h I have declared:

IBOutlet EAGLView *openGLVista;

In viewDidLoad:

openGLVista = [[EAGLView alloc] init];


CameraImageHelper.h

@interface CameraImageHelper : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate>
{
    AVCaptureSession *session;
}
@property (retain) AVCaptureSession *session;

+ (void) startRunning;
+ (void) stopRunning;

+ (UIView *) previewWithBounds: (CGRect) bounds;
@end

CameraImageHelper.m

- (void) initialize
{
    NSError *error;
    AVCaptureDeviceInput *captureInput = [AVCaptureDeviceInput deviceInputWithDevice:[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo] error:&error];
    if (!captureInput)
    {
        NSLog(@"Error: %@", error);
        return;
    }


    self.session = [[[AVCaptureSession alloc] init] autorelease];
    [self.session addInput:captureInput];
}

- (id) init
{
    if (self = [super init]) [self initialize];
    return self;
}

- (UIView *) previewWithBounds: (CGRect) bounds
{
    UIView *view = [[[UIView alloc] initWithFrame:bounds] autorelease];

    AVCaptureVideoPreviewLayer *preview = [AVCaptureVideoPreviewLayer layerWithSession: self.session];
    preview.frame = bounds;
    preview.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [view.layer addSublayer: preview];

    return view;
}

- (void) dealloc
{
    self.session = nil;
    [super dealloc];
}

#pragma mark Class Interface

static CameraImageHelper *sharedInstance = nil;   // backing storage for the singleton

+ (id) sharedInstance // private
{
    if (!sharedInstance) sharedInstance = [[self alloc] init];
    return sharedInstance;
}

+ (void) startRunning
{
    [[[self sharedInstance] session] startRunning]; 
}

+ (void) stopRunning
{
    [[[self sharedInstance] session] stopRunning];
}


+ (UIView *) previewWithBounds: (CGRect) bounds
{
    return [[self sharedInstance] previewWithBounds:bounds];
}

@end

Upvotes: 3

Views: 3245

Answers (1)

Anomie

Reputation: 94834

I see that in IB you are using the EAGLView as the view of the view controller, and in the code snippet you add the preview view as a subview of that view. In other words, your view hierarchy looks something like this:

*- EAGLView
    +- Preview view

Thus, Preview view is always on top of the EAGLView because it is a subview of the EAGLView. If you want to be able to display either one on top of the other, you will instead have to lay things out like this:

*- some generic UIView
    +- EAGLView
    +- Preview view

In other words, in IB you should have a generic UIView bound to the view property, and then drag the EAGLView so it is inside that generic UIView. Then your code for adding the preview view should work right.
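If you prefer to wire the same hierarchy up in code, a minimal sketch of the view controller side could look like this (assuming self.view is a plain UIView and openGLVista is the IBOutlet from the question, already added as a subview in IB):

- (void)viewDidLoad
{
    [super viewDidLoad];

    // self.view is a plain UIView; openGLVista (the EAGLView outlet) is already a subview of it.
    [CameraImageHelper startRunning];

    UIView *preview = [CameraImageHelper previewWithBounds:self.view.bounds];
    [self.view addSubview:preview];

    // Both views are now siblings, so reordering calls actually take effect:
    [self.view bringSubviewToFront:openGLVista];   // GL content above the camera preview
}

The key point is that the EAGLView and the preview view end up as siblings under the same parent, so bringSubviewToFront: / sendSubviewToBack: can reorder them freely.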


BTW, this does not do what you seem to think:

fafa= [[UIView alloc]initWithFrame:self.view.bounds]; //returns a UIView with the cameraview as a layer of that view. It works well (checked)
fafa = [CameraImageHelper previewWithBounds:self.view.bounds];

The first line creates a generic UIView. Then the second throws it away (leaking the memory!), replacing it with the preview view. You should just delete the first line.
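With that first line removed (and the hierarchy fixed as described above, which also makes the alpha = 0.5 workaround unnecessary), the snippet reduces to:

UIView *fafa = [CameraImageHelper previewWithBounds:self.view.bounds];
[self.view addSubview:fafa];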

Upvotes: 4
