eoLithic

Reputation: 883

Error Domain=AVFoundationErrorDomain Code=-11821 "Cannot Decode"

I've run into strange behaviour while trying to merge videos with AVFoundation. I'm pretty sure I've made a mistake somewhere, but I'm too blind to see it. My goal is simply to merge 4 videos (later there will be a crossfade transition between them). Every time I try to export the video I get this error:

Error Domain=AVFoundationErrorDomain Code=-11821 "Cannot Decode" UserInfo=0x7fd94073cc30 {NSLocalizedDescription=Cannot Decode, NSLocalizedFailureReason=The media data could not be decoded. It may be damaged.}

The strangest part is that if I don't provide the AVAssetExportSession with an AVMutableVideoComposition, everything works fine! I can't understand what I'm doing wrong. The source videos were downloaded from YouTube and have the .mp4 extension; I can play them with MPMoviePlayerController. While checking the source code, please look carefully at the AVMutableVideoComposition. I was testing this code with Xcode 6.0.1 on the iOS Simulator.

#import "VideoStitcher.h"
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
#import <AssetsLibrary/AssetsLibrary.h>

@implementation VideoStitcher
{
    VideoStitcherCompletionBlock _completionBlock;
    AVMutableComposition *_composition;
    AVMutableVideoComposition *_videoComposition;
}

- (instancetype)init
{
    self = [super init];
    if (self)
    {
        _composition = [AVMutableComposition composition];
        _videoComposition = [AVMutableVideoComposition videoComposition];
    }
    return self;
}

- (void)compileVideoWithAssets:(NSArray *)assets completion:(VideoStitcherCompletionBlock)completion
{
    _completionBlock = [completion copy];

    if (assets == nil || assets.count < 2)
    {
        // We need at least two videos to make a stitch, right?
        NSAssert(NO, @"VideoStitcher: assets parameter is nil or has not enough items in it");
    }
    else
    {
        [self composeAssets:assets];
        if (_composition != nil) // if stitching went well and no errors were found
            [self exportComposition];
    }
}

- (void)composeAssets:(NSArray *)assets
{
    AVMutableCompositionTrack *compositionVideoTrack = [_composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                     preferredTrackID:kCMPersistentTrackID_Invalid];

    NSError *compositionError = nil;
    CMTime currentTime = kCMTimeZero;
    AVAsset *asset = nil;
    for (int i = (int)assets.count - 1; i >= 0; i--) //For some reason videos are compiled in reverse order. Find the bug later. 06.10.14
    {
        asset = assets[i];
        AVAssetTrack *assetVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
        BOOL success = [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, assetVideoTrack.timeRange.duration)
                                                      ofTrack:assetVideoTrack
                                                       atTime:currentTime
                                                        error:&compositionError];
        if (success)
        {
            CMTimeAdd(currentTime, asset.duration);
        }
        else
        {
            NSLog(@"VideoStitcher: something went wrong during inserting time range in composition");
            if (compositionError != nil)
            {
                NSLog(@"%@", compositionError);
                _completionBlock(nil, compositionError);
                _composition = nil;
                return;
            }
        }
    }

    AVMutableVideoCompositionInstruction *videoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    videoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, _composition.duration);
    videoCompositionInstruction.backgroundColor = [[UIColor redColor] CGColor];
    _videoComposition.instructions = @[videoCompositionInstruction];
    _videoComposition.renderSize = [self calculateOptimalRenderSizeFromAssets:assets];
    _videoComposition.frameDuration = CMTimeMake(1, 600);
}

- (void)exportComposition
{
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString *myPathDocs =  [documentsDirectory stringByAppendingPathComponent:@"testVideo.mov"];
    NSURL *url = [NSURL fileURLWithPath:myPathDocs];


    NSString *filePath = [url path];
    NSFileManager *fileManager = [NSFileManager defaultManager];
    if ([fileManager fileExistsAtPath:filePath]) {
        NSError *error;
        if ([fileManager removeItemAtPath:filePath error:&error] == NO) {
            NSLog(@"removeItemAtPath %@ error:%@", filePath, error);
        }
    }

    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:_composition
                                                                      presetName:AVAssetExportPreset1280x720];
    exporter.outputURL = url;
    exporter.outputFileType = AVFileTypeQuickTimeMovie;
    exporter.shouldOptimizeForNetworkUse = YES;
    exporter.videoComposition = _videoComposition;
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        [self exportDidFinish:exporter];
    }];
}

- (void)exportDidFinish:(AVAssetExportSession*)session
{
    NSLog(@"%li", session.status);
    if (session.status == AVAssetExportSessionStatusCompleted)
    {
        NSURL *outputURL = session.outputURL;

        // time to call delegate methods, but for testing purposes we save the video in 'photos' app

        ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
        if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:outputURL])
        {
            [library writeVideoAtPathToSavedPhotosAlbum:outputURL completionBlock:^(NSURL *assetURL, NSError *error){
                if (error == nil)
                {
                    NSLog(@"successfully saved video");
                }
                else
                {
                    NSLog(@"saving video failed.\n%@", error);
                }
            }];
        }
    }
    else if (session.status == AVAssetExportSessionStatusFailed)
    {
        NSLog(@"VideoStitcher: exporting failed.\n%@", session.error);
    }
}

- (CGSize)calculateOptimalRenderSizeFromAssets:(NSArray *)assets
{
    AVAsset *firstAsset = assets[0];
    AVAssetTrack *firstAssetVideoTrack = [[firstAsset tracksWithMediaType:AVMediaTypeVideo] firstObject];
    CGFloat maxWidth = firstAssetVideoTrack.naturalSize.width;
    CGFloat maxHeight = firstAssetVideoTrack.naturalSize.height;

    for (AVAsset *asset in assets)
    {
        AVAssetTrack *assetVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
        if (assetVideoTrack.naturalSize.width > maxWidth)
            maxWidth = assetVideoTrack.naturalSize.width;
        if (assetVideoTrack.naturalSize.height > maxHeight)
            maxHeight = assetVideoTrack.naturalSize.height;
    }

    return CGSizeMake(maxWidth, maxHeight);
}

@end

Thank you for your attention. I am really tired; I've been trying to find the bug for four hours straight. I'll go to sleep now.

Upvotes: 4

Views: 12744

Answers (2)

Dinesh Kumar

Reputation: 151

Error: domain "AVFoundationErrorDomain", code 18446744073709539816

Solution (Swift 5.5): stop running multiple AVPlayer instances on a background thread.

Upvotes: -2

eoLithic

Reputation: 883

I've finally found the solution. The description of the error led me in the wrong direction: "Cannot Decode. The media data could not be decoded. It may be damaged." From this description you might think there is something wrong with your video files. I spent 5 hours experimenting with formats, debugging, and so on.

Well, THE ANSWER IS COMPLETELY DIFFERENT!

My mistake was that I forgot that CMTimeAdd() returns a value. I thought it changed the value of its first argument, and in the code you can see this:

CMTime currentTime = kCMTimeZero;
for (int i = (int)assets.count - 1; i >= 0; i--)
{
    CMTimeAdd(currentTime, asset.duration); //HERE!! I don't actually increment the value! currentTime is always kCMTimeZero
}
videoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, _composition.duration); // And that's where everything breaks!
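
For reference, here is a minimal sketch of the corrected loop; the only change is assigning the result of CMTimeAdd() back to currentTime:

CMTime currentTime = kCMTimeZero;
for (int i = (int)assets.count - 1; i >= 0; i--)
{
    asset = assets[i];
    // CMTimeAdd() returns the sum and does not modify its arguments,
    // so the result has to be assigned back to currentTime.
    currentTime = CMTimeAdd(currentTime, asset.duration);
}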

The lesson I've learned: when working with AVFoundation, always check your time values! It's very important; otherwise you'll get a lot of bugs.
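
If it helps anyone, this is how I now sanity-check CMTime values while debugging (a small sketch using the Core Media helpers CMTimeGetSeconds, CMTimeShow and CMTIME_IS_VALID):

CMTime t = CMTimeAdd(currentTime, asset.duration);
NSLog(@"current time: %f seconds", CMTimeGetSeconds(t)); // readable value in seconds
CMTimeShow(t);                                           // prints the raw value/timescale to the console
NSAssert(CMTIME_IS_VALID(t), @"CMTime is invalid");      // catch invalid times early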

Upvotes: 6
