Reputation: 4405
I'm trying to achieve something that I thought would be super easy, but it turns out it isn't.
I'm playing with the source code of the "react-native-audio" library, but you can assume I'm working natively for the sake of this question.
Here is a reference for the source code I'm playing with.
My goal is simple: I'm using AVAudioRecorder
to record a meeting (approximately 30 minutes long). In case of an incoming call in the middle of the recording, I'd like my app to be able to recover by doing one of the following:
1) "pause" the record on "incoming call" and "resume" when app is back to foreground.
2) on incoming call - close the current file, and when app is back to foreground start a new recording (part 2) with a new file.
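Something like this untested sketch is what I have in mind for option (2); _audioRecorder, the method name, the file name and the settings are just placeholders:
// Rough, untested sketch of option (2): finalize the current file on interruption,
// then start a brand new recording once the session can be reactivated.
- (void)handleInterruptionForOptionTwo:(NSNotification *)notification
{
    AVAudioSessionInterruptionType type =
        [notification.userInfo[AVAudioSessionInterruptionTypeKey] unsignedIntegerValue];

    if (type == AVAudioSessionInterruptionTypeBegan) {
        // stop (not pause) finalizes and closes the current file
        [_audioRecorder stop];
    } else {
        NSError *error;
        [[AVAudioSession sharedInstance] setActive:YES error:&error];

        // start "part 2" in a new file (name and settings are just examples)
        NSURL *docs = [[NSFileManager defaultManager] URLsForDirectory:NSDocumentDirectory
                                                              inDomains:NSUserDomainMask][0];
        NSURL *partTwoURL = [docs URLByAppendingPathComponent:@"meeting-part2.m4a"];
        NSDictionary *settings = @{ AVFormatIDKey: @(kAudioFormatMPEG4AAC),
                                    AVNumberOfChannelsKey: @1,
                                    AVSampleRateKey: @44100.0 };
        _audioRecorder = [[AVAudioRecorder alloc] initWithURL:partTwoURL settings:settings error:&error];
        [_audioRecorder record];
    }
}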
Obviously option (1) is preferred.
Please note that I'm well aware of AVAudioSessionInterruptionNotification
and have used it in my experiments, with no luck so far. For example:
- (void)receiveAudioSessionNotification:(NSNotification *)notification
{
    if ([notification.name isEqualToString:AVAudioSessionInterruptionNotification]) {
        NSLog(@"AVAudioSessionInterruptionNotification");
        NSNumber *type = [notification.userInfo valueForKey:AVAudioSessionInterruptionTypeKey];

        if ([type isEqualToNumber:[NSNumber numberWithInt:AVAudioSessionInterruptionTypeBegan]]) {
            NSLog(@"*** InterruptionTypeBegan");
            [self pauseRecording];
        } else {
            NSLog(@"*** InterruptionTypeEnded");
            [_recordSession setActive:YES error:nil];
        }
    }
}
Please note I will set up a bounty for this question, but the only acceptable answer will be real-world, working code, not something which "should work in theory". Many thanks for the help :)
Upvotes: 3
Views: 837
Reputation: 36169
I chose AVAudioEngine and AVAudioFile as the solution because the code is brief and AVFoundation's interruption handling is particularly simple: your player/recorder objects are paused for you, and unpausing them reactivates your audio session.
N.B. AVAudioFile doesn't have an explicit close method; instead it writes its headers and closes the file during dealloc, a choice which regrettably complicates what would otherwise be a simple API. In practice you "close" the file by dropping the last strong reference to it, as the dispatch_after block below does.
#import "ViewController.h"
#import <AVFoundation/AVFoundation.h>

@interface ViewController ()
@property (nonatomic) AVAudioEngine *audioEngine;
@property AVAudioFile *outputFile;
@end
@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];

    AVAudioSession *session = [AVAudioSession sharedInstance];
    NSError *error;
    if (![session setCategory:AVAudioSessionCategoryRecord error:&error]) {
        NSLog(@"Failed to set session category: %@", error);
    }

    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(audioInterruptionHandler:)
                                                 name:AVAudioSessionInterruptionNotification
                                               object:nil];

    NSURL *outputURL = [[[NSFileManager defaultManager] URLsForDirectory:NSDocumentDirectory
                                                                inDomains:NSUserDomainMask][0]
                        URLByAppendingPathComponent:@"output.aac"];

    __block BOOL outputFileInited = NO;

    self.audioEngine = [[AVAudioEngine alloc] init];
    AVAudioInputNode *inputNode = self.audioEngine.inputNode;

    [inputNode installTapOnBus:0 bufferSize:512 format:nil block:^(AVAudioPCMBuffer *buffer, AVAudioTime *when) {
        NSError *error;

        // Lazily create the output file from the first buffer's format.
        // Once the file has been nil-ed to finish the recording,
        // outputFileInited stops it from being recreated.
        if (self.outputFile == nil && !outputFileInited) {
            NSDictionary *settings = @{
                AVFormatIDKey: @(kAudioFormatMPEG4AAC),
                AVNumberOfChannelsKey: @(buffer.format.channelCount),
                AVSampleRateKey: @(buffer.format.sampleRate)
            };

            self.outputFile = [[AVAudioFile alloc] initForWriting:outputURL settings:settings error:&error];
            if (!self.outputFile) {
                NSLog(@"output file error: %@", error);
                abort();
            }
            outputFileInited = YES;
        }

        if (self.outputFile && ![self.outputFile writeFromBuffer:buffer error:&error]) {
            NSLog(@"AVAudioFile write error: %@", error);
        }
    }];

    if (![self.audioEngine startAndReturnError:&error]) {
        NSLog(@"engine start error: %@", error);
    }

    // To stop recording, nil the outputFile at some point in the future.
    dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(20 * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{
        NSLog(@"Finished");
        self.outputFile = nil;
    });
}
// https://developer.apple.com/library/archive/documentation/Audio/Conceptual/AudioSessionProgrammingGuide/HandlingAudioInterruptions/HandlingAudioInterruptions.html
- (void)audioInterruptionHandler:(NSNotification *)notification {
    NSDictionary *info = notification.userInfo;
    AVAudioSessionInterruptionType type = [info[AVAudioSessionInterruptionTypeKey] unsignedIntegerValue];

    switch (type) {
        case AVAudioSessionInterruptionTypeBegan:
            // The engine is paused for us; nothing more to do here.
            NSLog(@"Begin interruption");
            break;
        case AVAudioSessionInterruptionTypeEnded:
            NSLog(@"End interruption");
            // or ignore shouldResume if you're really keen to resume recording
            AVAudioSessionInterruptionOptions endOptions = [info[AVAudioSessionInterruptionOptionKey] unsignedIntegerValue];
            if (AVAudioSessionInterruptionOptionShouldResume == endOptions) {
                NSError *error;
                if (![self.audioEngine startAndReturnError:&error]) {
                    NSLog(@"Error restarting engine: %@", error);
                }
            }
            break;
    }
}
@end
N.B. you probably want to enable background audio (and, of course, add an NSMicrophoneUsageDescription
string to your Info.plist).
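For reference, the Info.plist entries in question look something like this (the usage description string is only an example):
<key>UIBackgroundModes</key>
<array>
    <string>audio</string>
</array>
<key>NSMicrophoneUsageDescription</key>
<string>This app records meeting audio with the microphone.</string>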
Upvotes: 2