Reputation: 1096
Currently I am working on a drawing application based on GLPaint. Saving the current screen has become a real pain for me. I have a view controller, and on top of its view I have loaded my UIImageView and a UIView (PaintingView), so it seems like I am drawing on top of the UIImageView.
I have managed to capture my current drawing with the help of this question: GLPaint save image. When I try to capture my current drawing, I get my drawing but on a black background. What I want is my drawing together with the background image (the UIImageView). Should I overlay the UIView with the UIImageView?
Upvotes: 5
Views: 453
Reputation: 1254
I use this code to grab my image from OpenGL:
- (BOOL)iPhoneRetina {
    return ([[UIScreen mainScreen] respondsToSelector:@selector(displayLinkWithTarget:selector:)] && ([UIScreen mainScreen].scale == 2.0)) ? YES : NO;
}

void releasePixels(void *info, const void *data, size_t size) {
    free((void *)data);
}

- (UIImage *)glToUIImage {
    int scale = [self iPhoneRetina] ? 2 : 1;
    int imageWidth = self.frame.size.width * scale;
    int imageHeight = self.frame.size.height * scale;
    NSInteger myDataLength = imageWidth * imageHeight * 4;

    // Allocate a buffer and read the pixels of the currently bound framebuffer into it.
    GLubyte *buffer = (GLubyte *)malloc(myDataLength);
    glReadPixels(0, 0, imageWidth, imageHeight, GL_RGBA, GL_UNSIGNED_BYTE, buffer);

    // Make a data provider that takes ownership of the buffer; releasePixels frees it.
    CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, buffer, myDataLength, releasePixels);

    // Prep the ingredients.
    int bitsPerComponent = 8;
    int bitsPerPixel = 32;
    int bytesPerRow = 4 * imageWidth;
    CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceRGB();
    CGBitmapInfo bitmapInfo = (CGBitmapInfo)kCGImageAlphaPremultipliedLast;
    CGColorRenderingIntent renderingIntent = kCGRenderingIntentDefault;

    // Make the CGImage.
    CGImageRef imageRef = CGImageCreate(imageWidth, imageHeight, bitsPerComponent, bitsPerPixel, bytesPerRow, colorSpaceRef, bitmapInfo, provider, NULL, NO, renderingIntent);

    // Render the image flipped, since OpenGL's pixel data is vertically mirrored.
    UIImage *myImage = [UIImage imageWithCGImage:imageRef scale:scale orientation:UIImageOrientationDownMirrored];

    CGImageRelease(imageRef);
    CGColorSpaceRelease(colorSpaceRef);
    CGDataProviderRelease(provider);

    return myImage;
}
And this one to merge it with a background image:
- (UIImage *)mergeImage:(UIImage *)image1 withImage:(UIImage *)image2 {
    CGSize size = image1.size;
    UIGraphicsBeginImageContextWithOptions(size, NO, 0);
    [image1 drawAtPoint:CGPointMake(0.0f, 0.0f)];
    [image2 drawAtPoint:CGPointMake(0.0f, 0.0f)];
    UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return result;
}
Something like this:
UIImage *finalImage = [self mergeImage:backgroundImage withImage:[self glToUIImage]];
Upvotes: 0
Reputation: 635
You should load your image using OpenGL, not UIKit (i.e. not a UIImageView). Otherwise you will only ever be able to capture the OpenGL view as one image and the UIKit view as a separate image.
To do this, render your image into a texture in the PaintingView class provided in the GLPaint example, and then draw it as a quad behind your strokes.
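A minimal sketch of that idea, assuming OpenGL ES 1.1 (which GLPaint uses) and the `backingWidth`/`backingHeight` ivars from the sample's PaintingView. The method names `loadBackgroundTexture:` and `drawBackgroundTexture:` are my own illustrative additions, not part of the GLPaint sample, and all GL calls must run with the painting view's EAGLContext current:

```objectivec
// Upload a UIImage into an OpenGL ES texture.
- (GLuint)loadBackgroundTexture:(UIImage *)image {
    CGImageRef cgImage = image.CGImage;
    size_t width  = CGImageGetWidth(cgImage);
    size_t height = CGImageGetHeight(cgImage);

    // Redraw the CGImage into a plain RGBA bitmap we can hand to glTexImage2D.
    GLubyte *data = (GLubyte *)calloc(width * height * 4, sizeof(GLubyte));
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef ctx = CGBitmapContextCreate(data, width, height, 8, width * 4,
                                             colorSpace,
                                             (CGBitmapInfo)kCGImageAlphaPremultipliedLast);
    CGContextDrawImage(ctx, CGRectMake(0, 0, width, height), cgImage);
    CGContextRelease(ctx);
    CGColorSpaceRelease(colorSpace);

    GLuint texture;
    glGenTextures(1, &texture);
    glBindTexture(GL_TEXTURE_2D, texture);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, (GLsizei)width, (GLsizei)height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, data);
    free(data);
    return texture;
}

// Draw the texture as a quad covering the whole painting view.
// Call this before drawing the strokes each time you redraw the scene.
- (void)drawBackgroundTexture:(GLuint)texture {
    GLfloat w = backingWidth, h = backingHeight;
    // Triangle strip covering the viewport; texture coordinates are flipped
    // vertically because CGImage's origin is top-left and GL's is bottom-left.
    GLfloat vertices[]  = { 0, 0,  w, 0,  0, h,  w, h };
    GLfloat texCoords[] = { 0, 1,  1, 1,  0, 0,  1, 0 };

    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, texture);
    glEnableClientState(GL_VERTEX_ARRAY);
    glEnableClientState(GL_TEXTURE_COORD_ARRAY);
    glVertexPointer(2, GL_FLOAT, 0, vertices);
    glTexCoordPointer(2, GL_FLOAT, 0, texCoords);
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
    glDisableClientState(GL_TEXTURE_COORD_ARRAY);
    glDisable(GL_TEXTURE_2D);
}
```

With the background rendered into the framebuffer this way, a single glReadPixels capture of the framebuffer contains both the image and the drawing, so no UIKit compositing step is needed.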
Upvotes: 2