Wonka Gollum

Reputation: 89

Using AVAssetWriter and AVAssetReader for Audio Recording and Playback

My app uses AVAssetReader to play songs from the iPod library. Now I want to add audio recording capability.

I recorded audio using AVAssetWriter and verified the resulting audio file (MPEG-4 AAC format) by playing it back successfully with AVAudioPlayer. My goal is to play the audio back using AVAssetReader. But when I create an AVURLAsset for the file, it has no tracks, and hence AVAssetReader fails (error code -11828, File Format Not Recognized).

What should I do to make AVAsset recognize the file format? Does AVAsset require some special file format?
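
For reference, this is roughly how the failing playback path looks (a minimal sketch; recordingURL is a placeholder for the URL the recorder wrote to):

AVURLAsset *asset = [AVURLAsset URLAssetWithURL:recordingURL options:nil];
NSArray *audioTracks = [asset tracksWithMediaType:AVMediaTypeAudio];
if (audioTracks.count == 0) {
    // This is what I see: the asset reports no tracks at all.
    NSLog(@"no audio tracks in %@", recordingURL);
} else {
    NSError *readerError = nil;
    AVAssetReader *reader = [AVAssetReader assetReaderWithAsset:asset error:&readerError];
    // ... attach an AVAssetReaderTrackOutput to audioTracks[0] and start reading ...
}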

Here is the recording code:

void setup_ASBD(void *f, double fs, int sel, int numChannels);
static AVAssetWriter *assetWriter = NULL;
static AVAssetWriterInput *assetWriterInput = NULL;
static CMAudioFormatDescriptionRef formatDesc;
AVAssetWriter *newAssetWriter(NSURL *url) {
    NSError *outError;
    assetWriter = [AVAssetWriter assetWriterWithURL:url fileType:AVFileTypeAppleM4A error:&outError];

    if(assetWriter == nil) {
        NSLog(@"%s: asset=%x, %@\n", __FUNCTION__, (int)assetWriter, outError);
        return assetWriter;
    }

    AudioChannelLayout audioChannelLayout = {
        .mChannelLayoutTag = kAudioChannelLayoutTag_Mono,
        .mChannelBitmap = 0,
        .mNumberChannelDescriptions = 0
    };

    // Convert the channel layout object to an NSData object.
    NSData *channelLayoutAsData = [NSData dataWithBytes:&audioChannelLayout length:offsetof(AudioChannelLayout, mChannelDescriptions)];

    // Get the compression settings for 128 kbps AAC.
    NSDictionary *compressionAudioSettings = @{
                                               AVFormatIDKey         : [NSNumber numberWithUnsignedInt:kAudioFormatMPEG4AAC],
                                               AVEncoderBitRateKey   : [NSNumber numberWithInteger:128000],
                                               AVSampleRateKey       : [NSNumber numberWithInteger:44100],
                                               AVChannelLayoutKey    : channelLayoutAsData,
                                               AVNumberOfChannelsKey : [NSNumber numberWithUnsignedInteger:1]
                                               };

    // Create the asset writer input with the compression settings and specify the media type as audio.
    assetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio outputSettings:compressionAudioSettings];
    assetWriterInput.expectsMediaDataInRealTime = YES;

    // Add the input to the writer if possible.
    if (assetWriterInput != nil && [assetWriter canAddInput:assetWriterInput]) {
        [assetWriter addInput:assetWriterInput];
    }
    else {
        NSLog(@"%s: assetWriterInput problem: %p\n", __FUNCTION__, assetWriterInput);
        return NULL;
    }

    [assetWriter startWriting];
    // Start a sample-writing session.
    [assetWriter startSessionAtSourceTime:kCMTimeZero];

    if(assetWriter.status != AVAssetWriterStatusWriting) {
        NSLog(@"%s: Bad writer status=%d\n", __FUNCTION__, (int)assetWriter.status);
        return NULL;
    }

    AudioStreamBasicDescription ASBD;
    setup_ASBD(&ASBD, 44100, 2, 1);
    // Pass only the layout header size (no channel descriptions for mono), matching the layout data above.
    CMAudioFormatDescriptionCreate(NULL, &ASBD, offsetof(AudioChannelLayout, mChannelDescriptions), &audioChannelLayout, 0, NULL, NULL, &formatDesc);
    //CMAudioFormatDescriptionCreate(NULL, &ASBD, 0, NULL, 0, NULL, NULL, &formatDesc);

    return assetWriter;

}

static int sampleCnt = 0;
void writeNewSamples(void *buffer, int len) {
    if(assetWriterInput == NULL) return;
    if([assetWriterInput isReadyForMoreMediaData]) {
        OSStatus result;
        CMBlockBufferRef blockBuffer = NULL;
        // kCFAllocatorNull as the block allocator: the block buffer must not free the caller-owned memory.
        result = CMBlockBufferCreateWithMemoryBlock(NULL, buffer, len, kCFAllocatorNull, NULL, 0, len, 0, &blockBuffer);
        if(result == noErr) {
            CMItemCount numSamples = len >> 1; // 16-bit mono: 2 bytes per sample

            const CMSampleTimingInfo sampleTiming = {CMTimeMake(1, 44100), CMTimeMake(sampleCnt, 44100), kCMTimeInvalid};
            CMItemCount numSampleTimingEntries = 1;

            const size_t sampleSize = 2;
            CMItemCount numSampleSizeEntries = 1;

            CMSampleBufferRef sampleBuffer;
            result = CMSampleBufferCreate(NULL, blockBuffer, true, NULL, NULL, formatDesc, numSamples, numSampleTimingEntries, &sampleTiming, numSampleSizeEntries, &sampleSize, &sampleBuffer);

            if(result == noErr) {
                if([assetWriterInput appendSampleBuffer:sampleBuffer] == YES) sampleCnt += numSamples;
                else {
                    NSLog(@"%s: appendSampleBuffer failed\n", __FUNCTION__);
                }
                printf("sampleCnt = %d\n", sampleCnt);
                CFRelease(sampleBuffer);
            }
            // The sample buffer retains the block buffer; release the local reference to avoid a leak.
            CFRelease(blockBuffer);
        }
    }
    else {
        NSLog(@"%s: AVAssetWriterInput not taking input data: status=%ld\n", __FUNCTION__, assetWriter.status);
    }
}

void stopAssetWriter(AVAssetWriter *assetWriter) {
    [assetWriterInput markAsFinished];
    [assetWriter finishWritingWithCompletionHandler:^{
        NSLog(@"%s: Done: %ld: %d samples\n", __FUNCTION__, assetWriter.status, sampleCnt);
        sampleCnt = 0;
    }];
    assetWriterInput = NULL;
}
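
The setup_ASBD() helper is not shown above. Assuming it fills the AudioStreamBasicDescription for packed 16-bit signed integer PCM (which matches the 2-bytes-per-sample arithmetic in writeNewSamples), it would look something like this; treating sel as bytes per sample is my own guess, not part of any Apple API:

void setup_ASBD(void *f, double fs, int sel, int numChannels) {
    AudioStreamBasicDescription *asbd = (AudioStreamBasicDescription *)f;
    memset(asbd, 0, sizeof(*asbd));
    asbd->mSampleRate       = fs;
    asbd->mFormatID         = kAudioFormatLinearPCM;
    asbd->mFormatFlags      = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
    asbd->mBitsPerChannel   = 8 * sel;          // sel = 2 -> 16-bit samples (assumed)
    asbd->mChannelsPerFrame = numChannels;
    asbd->mFramesPerPacket  = 1;                // uncompressed PCM: one frame per packet
    asbd->mBytesPerFrame    = sel * numChannels;
    asbd->mBytesPerPacket   = asbd->mBytesPerFrame;
}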

Upvotes: 1

Views: 1747

Answers (1)

Wonka Gollum

Reputation: 89

It turns out that AVAsset expects a "valid" file extension. When the file name does not end in one of the common extensions such as .mp3, .caf, or .m4a, AVAsset refuses to look at the file header to figure out the media format. AVAudioPlayer, on the other hand, is completely indifferent to the file name and works out the media format by itself from the file header.
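
So the fix is simply to record to a URL whose path ends in .m4a. A minimal sketch using the functions from the question (the file name itself is arbitrary):

NSString *docs = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES)[0];
NSURL *url = [NSURL fileURLWithPath:[docs stringByAppendingPathComponent:@"recording.m4a"]];
AVAssetWriter *writer = newAssetWriter(url);
// ... record via writeNewSamples(), then stopAssetWriter(writer) ...

// With the .m4a extension, AVURLAsset now finds the audio track:
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
NSLog(@"audio tracks: %lu", (unsigned long)[asset tracksWithMediaType:AVMediaTypeAudio].count);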

This difference does not appear anywhere in Apple's documentation, and I ended up wasting more than a week on this. Sigh...

Upvotes: 2
