Reputation: 41
I'm trying to make an app that records the contents of a UIImageView along with microphone audio and writes them to a video file in real time (kind of like the Talking Tom Cat app).
I'm using AVAssetWriterInputPixelBufferAdaptor to record the contents of the UIImageView without trouble, but I simply have no idea how to incorporate audio. It's not from a lack of trying, either: I've worked on this specific problem for more than a week, combing this website, Google, the iPhone developer forums, and so on. This just isn't my cup of tea.
The closest reference to this is: How do I export UIImage array as a movie?
I can capture the video and audio separately and encode them together after the fact (I really have tried hard to solve this), but getting the app to record in real time is the goal.
Here is some framework code that demonstrates the recording I have so far:
Here is the .h file:
//
// RecordingTestProjectViewController.h
// RecordingTestProject
//
// Created by Sean Luck on 7/26/11.
// Copyright 2011 __MyCompanyName__. All rights reserved.
//
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
#import <CoreGraphics/CoreGraphics.h>
#import <CoreMedia/CoreMedia.h>
#import <CoreVideo/CoreVideo.h>
#import <QuartzCore/QuartzCore.h>
@interface RecordingTestProjectViewController : UIViewController
{
UIImageView *testView;
NSTimer *theTimer;
NSTimer *assetWriterTimer;
AVMutableComposition *mutableComposition;
AVAssetWriter *assetWriter;
AVAssetWriterInput *assetWriterInput;
AVAssetWriterInput *_audioWriterInput;
AVCaptureDeviceInput *audioInput;
AVCaptureSession *_capSession;
AVAssetWriterInputPixelBufferAdaptor *assetWriterPixelBufferAdaptor;
CFAbsoluteTime firstFrameWallClockTime;
int count;
}
-(void) writeSample: (NSTimer*) _timer;
-(void) startRecording;
-(void) pauseRecording;
-(void) stopRecording;
-(NSString*) pathToDocumentsDirectory;
@end
And here is the .m file:
//
// RecordingTestProjectViewController.m
// RecordingTestProject
//
// Created by Sean Luck on 7/26/11.
// Copyright 2011 __MyCompanyName__. All rights reserved.
//
#import "RecordingTestProjectViewController.h"
#import <AVFoundation/AVFoundation.h>
#import <CoreGraphics/CoreGraphics.h>
#import <CoreMedia/CoreMedia.h>
#import <CoreVideo/CoreVideo.h>
#import <QuartzCore/QuartzCore.h>
#define OUTPUT_FILE_NAME @"screen.mov"
#define TIME_SCALE 600
@implementation RecordingTestProjectViewController
- (void)dealloc
{
// release the owned recording objects (the timer and testView are not owned here)
[assetWriter release];
[assetWriterInput release];
[assetWriterPixelBufferAdaptor release];
[super dealloc];
}
- (void)didReceiveMemoryWarning
{
// Releases the view if it doesn't have a superview.
[super didReceiveMemoryWarning];
// Release any cached data, images, etc that aren't in use.
}
#pragma mark - View lifecycle
// Implement viewDidLoad to do additional setup after loading the view, typically from a nib.
- (void)viewDidLoad
{
testView = [[UIImageView alloc] initWithImage:nil];
testView.frame = CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height);
testView.userInteractionEnabled=NO;
testView.backgroundColor = [UIColor lightGrayColor];
[self.view addSubview:testView];
[testView release];
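// after this release the superview is the only owner of testView; the ivar is a
// non-owning reference that stays valid while the image view remains a subview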
[super viewDidLoad];
[self startRecording];
}
-(void) writeSample: (NSTimer*) _timer
{
if ([assetWriterInput isReadyForMoreMediaData])
{
NSLog(@"count=%.i",count);
count=count+10;
UIGraphicsBeginImageContext(testView.frame.size);
[testView.image drawInRect:CGRectMake(0, 0, testView.frame.size.width, testView.frame.size.height)];
CGContextSetBlendMode(UIGraphicsGetCurrentContext(), kCGBlendModeNormal);
CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), 0,1,1,1);
CGContextSetLineWidth(UIGraphicsGetCurrentContext(), 5);
CGContextBeginPath(UIGraphicsGetCurrentContext());
CGContextMoveToPoint(UIGraphicsGetCurrentContext(), 10+count/2, 10+count);
CGContextAddLineToPoint(UIGraphicsGetCurrentContext(), 20+count*2, 20+count);
CGContextStrokePath(UIGraphicsGetCurrentContext());
testView.image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
CVReturn cvErr = kCVReturnSuccess;
CGImageRef image = (CGImageRef) [testView.image CGImage];
// prepare the pixel buffer
CVPixelBufferRef pixelBuffer = NULL;
CFDataRef imageData= CGDataProviderCopyData(CGImageGetDataProvider(image));
cvErr = CVPixelBufferCreateWithBytes(kCFAllocatorDefault,
testView.frame.size.width,
testView.frame.size.height,
kCVPixelFormatType_32BGRA,
(void*)CFDataGetBytePtr(imageData),
CGImageGetBytesPerRow(image),
NULL, NULL, NULL,
&pixelBuffer);
// calculate the time
CFAbsoluteTime thisFrameWallClockTime = CFAbsoluteTimeGetCurrent();
CFTimeInterval elapsedTime = thisFrameWallClockTime - firstFrameWallClockTime;
CMTime presentationTime = CMTimeMake (elapsedTime * TIME_SCALE, TIME_SCALE);
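// the writer session starts at source time 0 (see startRecording), so each frame's
// presentation time is just the wall-clock seconds since the first frame expressed
// on the 600-unit timescale; audio appended to the same writer must use this clock too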
if (!cvErr)
{
// write the sample
BOOL appended = [assetWriterPixelBufferAdaptor appendPixelBuffer:pixelBuffer withPresentationTime:presentationTime];
if (!appended)
{
NSLog(@"failed to append pixel buffer");
[self stopRecording];
}
}
CVPixelBufferRelease(pixelBuffer);
CFRelease(imageData);
}
if (count>1000)
{
[self stopRecording];
}
}
-(void) startRecording
{
// Doesn't record audio at all. Needs to be implemented.
// create the AVAssetWriter
NSString *moviePath = [[self pathToDocumentsDirectory] stringByAppendingPathComponent:OUTPUT_FILE_NAME];
if ([[NSFileManager defaultManager] fileExistsAtPath:moviePath])
{
[[NSFileManager defaultManager] removeItemAtPath:moviePath error:nil];
}
NSURL *movieURL = [NSURL fileURLWithPath:moviePath];
NSError *movieError = nil;
[assetWriter release];
assetWriter = [[AVAssetWriter alloc] initWithURL:movieURL fileType: AVFileTypeQuickTimeMovie error: &movieError];
NSDictionary *assetWriterInputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
AVVideoCodecH264, AVVideoCodecKey,
[NSNumber numberWithInt:testView.frame.size.width], AVVideoWidthKey,
[NSNumber numberWithInt:testView.frame.size.height], AVVideoHeightKey,
nil];
[assetWriterInput release];
assetWriterInput = [[AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:assetWriterInputSettings] retain]; // retain the autoreleased input so the later release is balanced
assetWriterInput.expectsMediaDataInRealTime = YES;
[assetWriter addInput:assetWriterInput];
[assetWriterPixelBufferAdaptor release];
assetWriterPixelBufferAdaptor = [[AVAssetWriterInputPixelBufferAdaptor alloc] initWithAssetWriterInput:assetWriterInput sourcePixelBufferAttributes:nil];
[assetWriter startWriting];
firstFrameWallClockTime = CFAbsoluteTimeGetCurrent();
[assetWriter startSessionAtSourceTime: CMTimeMake(0, TIME_SCALE)];
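// from here on, source time 0 corresponds to firstFrameWallClockTime;
// writeSample: computes every frame's presentation time against that zero point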
// start writing samples to it
[assetWriterTimer invalidate]; // the run loop owns the scheduled timer, so invalidate any previous one instead of releasing it
assetWriterTimer = [NSTimer scheduledTimerWithTimeInterval:0.2 target:self selector:@selector (writeSample:) userInfo:nil repeats:YES];
// movieURL and assetWriterInputSettings are autoreleased; releasing them here would over-release
}
-(void) stopRecording
{
if (assetWriterTimer!=nil)
{
[assetWriterTimer invalidate];
assetWriterTimer = nil;
[assetWriter finishWriting];
NSString *moviePath = [[self pathToDocumentsDirectory] stringByAppendingPathComponent:OUTPUT_FILE_NAME];
UISaveVideoAtPathToSavedPhotosAlbum (moviePath, nil, nil, nil);
}
}
-(void) pauseRecording
{
// needs to be implemented
}
-(NSString*) pathToDocumentsDirectory
{
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
return documentsDirectory;
}
- (void)viewDidUnload
{
[super viewDidUnload];
// Release any retained subviews of the main view.
// e.g. self.myOutlet = nil;
}
- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation
{
// Return YES for supported orientations
return YES;
}
@end
So that's that. If anyone can modify this code to record microphone audio, I'm sure there are many people besides me who would appreciate it. Once audio is added, this would make a straightforward framework for doing in-app screencasts so developers can demo their apps.
FYI: my code is open source, largely modeled on the VTMScreenRecorderTest project from http://www.subfurther.com/blog/2011/04/12/voices-that-matter-iphone-spring-2011/ . I made small to moderate modifications and cleaned up the memory leaks.
Upvotes: 4
Views: 3092
Reputation: 11
I just discovered that the CMTime timestamps must be in sync if you want to use the microphone along with recorded video. I tried to synchronize them by changing the sample time on the audio using CMSampleBufferSetOutputPresentationTimeStamp, but either I'm missing something or it doesn't work. If you trap the time at which your microphone begins and add it as an offset to your video, each of the AVAssetWriter samples here on Stack Overflow seems to work fine.
Here is what I'm doing to keep the video in sync with the microphone:
Img = Image
Snd = Sound
Avmedia = AVAssetWriter
etc.
#define Cmtm__Pref_Timescale_uk 10000
CMTime Duration_cmtm = CMTimeMakeWithSeconds( Avmedia_v->Img_Duration__Sec_df, Cmtm__Pref_Timescale_uk );
Avmedia_v->Img_cmtm = CMTimeAdd( Duration_cmtm, Avmedia_v->Exprt__Begin_cmtm );
if( [ Avmedia_v->Clrbuf__Adaptor_v appendPixelBuffer: Clrbuf_v withPresentationTime:
Avmedia_v->Img_cmtm ] is no )
{
//If the operation was unsuccessful,
//invoke the AVAssetWriter object’s finishWriting method in order to save a partially completed asset.
[ Avmedia_v->Avwriter_v finishWriting ];
CLog( Lv_Minor, "Movie Img exprt failed" );
}
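In case it helps, here is a minimal sketch of the audio side built around that same offset idea. It reuses the ivars already declared in the question's header (_audioWriterInput, _capSession, audioInput, assetWriter, firstFrameWallClockTime); the method name setUpAudioRecording and the two bookkeeping ivars _audioTimeOffset (CMTime) and _haveAudioOffset (BOOL) are names I made up for illustration, and the view controller would have to adopt AVCaptureAudioDataOutputSampleBufferDelegate. I have not run this against the question's project, so treat it as a starting point rather than a drop-in fix:
// Hypothetical helper; call from startRecording, before [assetWriter startWriting].
- (void)setUpAudioRecording
{
    // AAC audio writer input attached to the same asset writer as the video.
    AudioChannelLayout layout;
    memset(&layout, 0, sizeof(AudioChannelLayout));
    layout.mChannelLayoutTag = kAudioChannelLayoutTag_Mono;
    NSDictionary *audioSettings = [NSDictionary dictionaryWithObjectsAndKeys:
        [NSNumber numberWithInt:kAudioFormatMPEG4AAC], AVFormatIDKey,
        [NSNumber numberWithInt:1], AVNumberOfChannelsKey,
        [NSNumber numberWithFloat:44100.0], AVSampleRateKey,
        [NSNumber numberWithInt:64000], AVEncoderBitRateKey,
        [NSData dataWithBytes:&layout length:sizeof(AudioChannelLayout)], AVChannelLayoutKey,
        nil];
    _audioWriterInput = [[AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio outputSettings:audioSettings] retain];
    _audioWriterInput.expectsMediaDataInRealTime = YES;
    [assetWriter addInput:_audioWriterInput];

    // Capture session that delivers raw microphone sample buffers to the delegate below.
    AVCaptureDevice *mic = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
    audioInput = [[AVCaptureDeviceInput deviceInputWithDevice:mic error:nil] retain];
    _capSession = [[AVCaptureSession alloc] init];
    [_capSession addInput:audioInput];
    AVCaptureAudioDataOutput *audioOutput = [[[AVCaptureAudioDataOutput alloc] init] autorelease];
    [audioOutput setSampleBufferDelegate:self queue:dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0)];
    [_capSession addOutput:audioOutput];
    [_capSession startRunning];
}

// AVCaptureAudioDataOutputSampleBufferDelegate: shift each audio buffer onto the
// video's zero-based clock, then hand it to the audio writer input.
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    if (![_audioWriterInput isReadyForMoreMediaData]) return;
    CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    // _haveAudioOffset / _audioTimeOffset are assumed new ivars, not part of the original header.
    if (!_haveAudioOffset)
    {
        // Map the first audio buffer to the wall-clock time elapsed since the first
        // video frame; later buffers keep their spacing relative to that point.
        CFTimeInterval elapsed = CFAbsoluteTimeGetCurrent() - firstFrameWallClockTime;
        _audioTimeOffset = CMTimeSubtract(CMTimeMakeWithSeconds(elapsed, TIME_SCALE), pts);
        _haveAudioOffset = YES;
    }
    CMSampleTimingInfo timing;
    timing.duration = CMSampleBufferGetDuration(sampleBuffer);
    timing.presentationTimeStamp = CMTimeAdd(pts, _audioTimeOffset);
    timing.decodeTimeStamp = kCMTimeInvalid;
    CMSampleBufferRef adjusted = NULL;
    if (CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault, sampleBuffer, 1, &timing, &adjusted) == noErr && adjusted != NULL)
    {
        [_audioWriterInput appendSampleBuffer:adjusted];
        CFRelease(adjusted);
    }
}
The audio arrives on a background queue while the video frames are appended from the main-thread timer; the only thing the two paths share is the wall-clock zero point, and AVAssetWriter interleaves the two inputs on its own. stopRecording would also need to call [_capSession stopRunning] and [_audioWriterInput markAsFinished] before finishWriting.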
Upvotes: 1