Reputation: 1141
I am attempting to capture a still image on my jailbroken iPhone from a console.
Disclaimer: I am an Objective-C newbie - most of this code is from: AVCaptureStillImageOutput never calls completition handler
I just want to know if it can work in a terminal.
I have tried various examples on SO and none seem to work. My question is: is it even possible, or does Apple lock this down for some reason (e.g. accessing the camera from the terminal might not be allowed)?
Here is my current camera.m:
#import <Foundation/Foundation.h>
#import <CoreVideo/CoreVideo.h>
#import <CoreMedia/CoreMedia.h>
#import <AVFoundation/AVFoundation.h>
#import <ImageIO/CGImageProperties.h>
#import <stdio.h>
#import <UIKit/UIKit.h>
@interface Camera : NSObject {
}
@property (readwrite, retain) AVCaptureStillImageOutput *stillImageOutput;
- (AVCaptureDevice *)frontFacingCameraIfAvailable;
- (void)setupCaptureSession;
- (void)captureNow;
@end
@implementation Camera
@synthesize stillImageOutput;
- (AVCaptureDevice *)frontFacingCameraIfAvailable {
    NSArray *videoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    AVCaptureDevice *captureDevice = nil;

    for (AVCaptureDevice *device in videoDevices) {
        if (device.position == AVCaptureDevicePositionFront) {
            captureDevice = device;
            break;
        }
    }

    // Couldn't find one on the front, so just get the default video device.
    if (!captureDevice) {
        captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    }

    return captureDevice;
}
- (void)setupCaptureSession {
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPresetMedium;

    AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    //[self.view.layer addSublayer:captureVideoPreviewLayer];

    NSError *error = nil;
    AVCaptureDevice *device = [self frontFacingCameraIfAvailable];
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!input) {
        // Handle the error appropriately.
        NSLog(@"ERROR: trying to open camera: %@", error);
    }
    [session addInput:input];

    self.stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [self.stillImageOutput setOutputSettings:outputSettings];
    [session addOutput:self.stillImageOutput];

    [session startRunning];
}
- (void)captureNow {
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in self.stillImageOutput.connections) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) { break; }
    }

    NSLog(@"about to request a capture from: %@", self.stillImageOutput);
    [self.stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
        UIImage *image = [[UIImage alloc] initWithData:imageData];
        NSLog(@"image captured: %@", [imageData bytes]);
    }];
}
@end
int main(int argc, const char *argv[]) {
    Camera *camera = [[Camera alloc] init];
    [camera setupCaptureSession];
    [camera captureNow];
}
Here is my Makefile:
XCODE_BASE=/Applications/Xcode.app/Contents
IPHONEOS=$(XCODE_BASE)/Developer/Platforms/iPhoneOS.platform
SDK=$(IPHONEOS)/Developer/SDKs/iPhoneOS7.1.sdk
FRAMEWORKS=$(SDK)/System/Library/Frameworks/
INCLUDES=$(SDK)/usr/include
camera:
	clang -mios-version-min=7.0 \
		-isysroot $(SDK) \
		-arch armv7 \
		camera.m \
		-lobjc \
		-framework Foundation -framework AVFoundation -framework CoreVideo -framework CoreMedia -framework CoreGraphics -framework CoreImage -framework UIKit -framework ImageIO \
		-o camera
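Build it by running the camera target from the directory containing the Makefile and camera.m (this assumes Xcode and the iOS 7.1 SDK are installed at the paths above):
make camera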
Copy the camera binary to the iPhone:
scp camera [email protected]:/var/root/camera
[email protected]'s password:
camera 100% 51KB 51.0KB/s 00:00
Result of running the app on the iPhone:
iPhone:~ root# ./camera
2014-03-21 15:17:55.550 camera[9483:507] about to request a capture from: <AVCaptureStillImageOutput: 0x14e2b940>
As you can see, the AVCaptureSession gets set up correctly, the front camera is found and selected, and the AVCaptureConnection is set correctly. However, the completion handler passed to captureStillImageAsynchronouslyFromConnection:completionHandler: never gets called.
Is this even possible to do? Am I doing something wrong?
EDIT: I have modified captureNow to sleep on the thread until the completion handler runs, and the app just waits forever after starting.
- (void)captureNow {
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in self.stillImageOutput.connections) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) { break; }
    }

    __block BOOL done = NO;

    NSLog(@"about to request a capture from: %@", self.stillImageOutput);
    [self.stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
        UIImage *image = [[UIImage alloc] initWithData:imageData];
        NSLog(@"image captured");
        done = YES;
    }];

    while (!done) {
        [NSThread sleepForTimeInterval:1.0];
    }
}
Upvotes: 0
Views: 324
Reputation: 61
Your app exits before the photo is captured. You need to add a completion block argument to -captureNow and, in main, do something like this:
__block BOOL done = NO;
[camera captureWithCompletionBlock:^(UIImage *image) {
    done = YES;
}];

while (!done) {
    [[NSRunLoop mainRunLoop] runUntilDate:[NSDate dateWithTimeIntervalSinceNow:0.1]];
}
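Note that, unlike the sleepForTimeInterval: loop in your edit, runUntilDate: keeps the main run loop serviced while waiting instead of blocking the thread outright, so any work scheduled on the main thread still gets a chance to run before the process exits.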
I mean something like this:
- (void)captureWithCompletionBlock:(void (^)(UIImage *image))block {
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in self.stillImageOutput.connections) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) { break; }
    }

    NSLog(@"about to request a capture from: %@", self.stillImageOutput);
    [self.stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
        UIImage *image = [[UIImage alloc] initWithData:imageData];
        NSLog(@"image captured: %@", [imageData bytes]);
        block(image);
    }];
}
You also need to retain the AVCaptureSession; just add it as a property:
@property (nonatomic,strong) AVCaptureSession* session;
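Putting the two changes together: setupCaptureSession assigns the session to that property instead of a local variable, and main pumps the run loop until the block fires. A minimal sketch, assuming the captureWithCompletionBlock: method above; the output path is just an example:
// In setupCaptureSession, keep the session alive through the property:
self.session = [[AVCaptureSession alloc] init];
self.session.sessionPreset = AVCaptureSessionPresetMedium;
// ... add the input and the still image output to self.session as before, then:
[self.session startRunning];

int main(int argc, const char *argv[]) {
    Camera *camera = [[Camera alloc] init];
    [camera setupCaptureSession];

    __block BOOL done = NO;
    [camera captureWithCompletionBlock:^(UIImage *image) {
        // Save the capture somewhere inspectable (this path is only an example).
        NSData *jpeg = UIImageJPEGRepresentation(image, 0.9);
        [jpeg writeToFile:@"/var/root/capture.jpg" atomically:YES];
        done = YES;
    }];

    // Keep the process alive and the main run loop serviced until the capture finishes.
    while (!done) {
        [[NSRunLoop mainRunLoop] runUntilDate:[NSDate dateWithTimeIntervalSinceNow:0.1]];
    }
    return 0;
}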
Upvotes: 1