Liron

Reputation: 2032

AVAssetExportSession combine video files and freeze frame between videos

I have an app which combines video files together to make a long video. There could be a delay between videos (e.g. V1 starts at t=0s and runs for 5 seconds, V2 starts at t=10s). In this case, I want the video to freeze on the last frame of V1 until V2 starts.

I'm using the code below, but during the gap between videos the whole frame goes white.

Any ideas how I can get the effect I'm looking for?

Thanks!

@interface VideoJoins : NSObject

-(instancetype)initWithURL:(NSURL*)url
                  andDelay:(NSTimeInterval)delay;

@property (nonatomic, strong) NSURL* url;
@property (nonatomic) NSTimeInterval delay;

@end

and

+(void)joinVideosSequentially:(NSArray*)videoJoins
                 withFileType:(NSString*)fileType
                     toOutput:(NSURL*)outputVideoURL
                 onCompletion:(dispatch_block_t) onCompletion
                      onError:(ErrorBlock) onError
                     onCancel:(dispatch_block_t) onCancel
{
  //From original question on http://stackoverflow.com/questions/6575128/how-to-combine-video-clips-with-different-orientation-using-avfoundation
  // Didn't add support for portrait+landscape.
  AVMutableComposition *composition = [AVMutableComposition composition];

  AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];

  AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];

  CMTime startTime = kCMTimeZero;

  /*videoJoins is an array of VideoJoins objects describing the clips to combine and their delays*/

  //for loop to combine clips into a single video
  for (NSInteger i=0; i < [videoJoins count]; i++)
  {
    VideoJoins* vj = videoJoins[i];
    NSURL *url  = vj.url;
    NSTimeInterval nextDelayTI = 0;
    if(i+1 < [videoJoins count])
    {
      VideoJoins* vjNext = videoJoins[i+1];
      nextDelayTI = vjNext.delay;
    }

    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];

    CMTime assetDuration = [asset duration];
    CMTime assetDurationWithNextDelay = assetDuration;
    if(nextDelayTI != 0)
    {
      CMTime nextDelay = CMTimeMakeWithSeconds(nextDelayTI, 1000000);
      assetDurationWithNextDelay = CMTimeAdd(assetDuration, nextDelay);
    }

    AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    AVAssetTrack *audioTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];

    //set the orientation
    if(i == 0)
    {
      [compositionVideoTrack setPreferredTransform:videoTrack.preferredTransform];
    }

    BOOL ok = [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, assetDurationWithNextDelay) ofTrack:videoTrack atTime:startTime error:nil];
    ok = [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, assetDuration) ofTrack:audioTrack atTime:startTime error:nil];

    startTime = CMTimeAdd(startTime, assetDurationWithNextDelay);
  }

  //Delete the output file if it already exists
  NSString* outputVideoPath = [outputVideoURL path];
  if ([[NSFileManager defaultManager] fileExistsAtPath:outputVideoPath])
  {
    [[NSFileManager defaultManager] removeItemAtPath:outputVideoPath error:nil];
  }

  //export the combined video
  AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:composition
                                                                    presetName:AVAssetExportPresetHighestQuality];

  exporter.outputURL = outputVideoURL;
  exporter.outputFileType = fileType;
  exporter.shouldOptimizeForNetworkUse = YES;

  [exporter exportAsynchronouslyWithCompletionHandler:^(void)
  {
    switch (exporter.status)
    {
      case AVAssetExportSessionStatusCompleted: {
        onCompletion();
        break;
      }
      case AVAssetExportSessionStatusFailed:
      {
        NSLog(@"Export Failed");
        NSError* err = exporter.error;
        NSLog(@"ExportSessionError: %@", [err localizedDescription]);
        onError(err);
        break;
      }
      case AVAssetExportSessionStatusCancelled:
        NSLog(@"Export Cancelled");
        NSLog(@"ExportSessionError: %@", [exporter.error localizedDescription]);
        onCancel();
        break;
      default:
        break;
    }
  }];
}
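
For reference, a call for the scenario above would look roughly like this (v1URL, v2URL and outputURL are placeholder file URLs; the containing class name VideoCreator is taken from the references further down, and the ErrorBlock is assumed to take a single NSError*):

NSArray *joins = @[[[VideoJoins alloc] initWithURL:v1URL andDelay:0],
                   [[VideoJoins alloc] initWithURL:v2URL andDelay:5.0]]; //freeze for 5 s before V2 starts

[VideoCreator joinVideosSequentially:joins
                        withFileType:AVFileTypeQuickTimeMovie
                            toOutput:outputURL
                        onCompletion:^{ NSLog(@"Joined"); }
                             onError:^(NSError *error) { NSLog(@"Join failed: %@", error); }
                            onCancel:^{ NSLog(@"Join cancelled"); }];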

EDIT: Got it working. Here is how I extract the images and generate the videos from those images:

+ (void)writeImageAsMovie:(UIImage*)image
                   toPath:(NSURL*)url
                 fileType:(NSString*)fileType
                 duration:(NSTimeInterval)duration
               completion:(VoidBlock)completion
{
  NSError *error = nil;
  AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:url
                                                         fileType:fileType
                                                            error:&error];
  NSParameterAssert(videoWriter);

  CGSize size = image.size;

  NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                 AVVideoCodecH264, AVVideoCodecKey,
                                 [NSNumber numberWithInt:size.width], AVVideoWidthKey,
                                 [NSNumber numberWithInt:size.height], AVVideoHeightKey,
                                 nil];
  AVAssetWriterInput* writerInput = [AVAssetWriterInput
                                      assetWriterInputWithMediaType:AVMediaTypeVideo
                                      outputSettings:videoSettings];

  AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
                                                   assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                                   sourcePixelBufferAttributes:nil];
  NSParameterAssert(writerInput);
  NSParameterAssert([videoWriter canAddInput:writerInput]);
  [videoWriter addInput:writerInput];

  //Start a session:
  [videoWriter startWriting];
  [videoWriter startSessionAtSourceTime:kCMTimeZero];

  //Write samples:
  CMTime halfTime = CMTimeMakeWithSeconds(duration/2, 100000);
  CMTime endTime = CMTimeMakeWithSeconds(duration, 100000);
  CVPixelBufferRef buffer = [VideoCreator pixelBufferFromCGImage:image.CGImage];
  [adaptor appendPixelBuffer:buffer withPresentationTime:kCMTimeZero];
  [adaptor appendPixelBuffer:buffer withPresentationTime:halfTime];
  [adaptor appendPixelBuffer:buffer withPresentationTime:endTime];
  CVPixelBufferRelease(buffer); //release, assuming pixelBufferFromCGImage returns a +1 (caller-owned) buffer

  //Finish the session:
  [writerInput markAsFinished];
  [videoWriter endSessionAtSourceTime:endTime];
  [videoWriter finishWritingWithCompletionHandler:^{
    if(videoWriter.error)
    {
      NSLog(@"Error:%@", [error localizedDescription]);
    }
    if(completion)
    {
      completion();
    }
  }];
}

+(void)generateVideoImageFromURL:(NSURL*)url
                          atTime:(CMTime)thumbTime
                     withMaxSize:(CGSize)maxSize
                      completion:(ImageBlock)handler
{
  AVURLAsset *asset=[[AVURLAsset alloc] initWithURL:url options:nil];

  if(!asset)
  {
    if(handler)
    {
      handler(nil);
    }
    return;
  }
  if(CMTIME_IS_POSITIVE_INFINITY(thumbTime))
  {
    thumbTime = asset.duration;
  }
  else if(CMTIME_IS_NEGATIVE_INFINITY(thumbTime) || CMTIME_IS_INVALID(thumbTime) || CMTIME_IS_INDEFINITE(thumbTime))
  {
    thumbTime = CMTimeMake(0, 30);
  }

  AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
  generator.appliesPreferredTrackTransform = YES;
  generator.maximumSize = maxSize;

  CMTime actualTime;
  NSError* error;
  CGImageRef image = [generator copyCGImageAtTime:thumbTime actualTime:&actualTime error:&error];
  UIImage *thumb = [[UIImage alloc] initWithCGImage:image];
  CGImageRelease(image);

  if(handler)
  {
    handler(thumb);
  }
}

Upvotes: 4

Views: 1961

Answers (2)

Patrick Michael

Reputation: 1

The first frame of a video asset is always black or white, so skip it by starting the inserted time range one frame in:

 CMTime delta = CMTimeMake(1, 25); //1 frame (if fps = 25)
 CMTimeRange timeRangeInVideoAsset = CMTimeRangeMake(delta,clipVideoTrack.timeRange.duration);
 nextVideoClipStartTime = CMTimeAdd(nextVideoClipStartTime, timeRangeInVideoAsset.duration);

I merged more than 400 short videos into one this way.
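
Plugged into the composition loop from the question, that one-frame offset would look something like this (a sketch only; the range duration is also shortened by one frame so it stays inside the source track, and videoTrack, compositionVideoTrack, startTime and assetDuration are the variables from the question's loop):

CMTime delta = CMTimeMake(1, 25); //skip one frame, assuming 25 fps material
CMTimeRange rangeWithoutFirstFrame = CMTimeRangeMake(delta, CMTimeSubtract(assetDuration, delta));

NSError *insertError = nil;
[compositionVideoTrack insertTimeRange:rangeWithoutFirstFrame
                               ofTrack:videoTrack
                                atTime:startTime
                                 error:&insertError];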

Upvotes: 0

blancos

Reputation: 1614

AVMutableComposition can only stitch videos together. I did it by doing two things:

  • Extracting the last frame of the first video as an image.
  • Making a video from that image (the duration depends on your requirement).

Then you can compose these three videos (V1, V2 and your single-image video), as sketched below. Both tasks are very easy to do.

For extracting the image out of the video, look at this link. If you don't want to use MPMoviePlayerController, which the accepted answer there uses, then look at the other answer by Steve.

For making a video from the image, check out this link. That question is about an audio issue, but I don't think you need audio, so just look at the method mentioned in the question itself.
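
Using the helpers from the question's edit, the whole flow would look roughly like this. This is only a sketch: v1URL, v2URL, gapURL and outputURL are placeholder file URLs, the class name VideoCreator and the VoidBlock/ErrorBlock signatures are assumed from the question's code, and the 5-second freeze matches the example in the question.

//1. Grab the last frame of V1 (kCMTimePositiveInfinity maps to the asset's duration in the helper).
[VideoCreator generateVideoImageFromURL:v1URL
                                 atTime:kCMTimePositiveInfinity
                            withMaxSize:CGSizeZero
                             completion:^(UIImage *lastFrame)
{
  //2. Turn that frame into a 5-second still clip.
  [VideoCreator writeImageAsMovie:lastFrame
                           toPath:gapURL
                         fileType:AVFileTypeQuickTimeMovie
                         duration:5.0
                       completion:^
  {
    //3. Stitch V1, the still clip and V2 together with no extra delays.
    //Note: the joining code in the question assumes every asset has an audio track,
    //so the audio-track lookup needs a guard for the silent still clip.
    NSArray *joins = @[[[VideoJoins alloc] initWithURL:v1URL andDelay:0],
                       [[VideoJoins alloc] initWithURL:gapURL andDelay:0],
                       [[VideoJoins alloc] initWithURL:v2URL andDelay:0]];
    [VideoCreator joinVideosSequentially:joins
                            withFileType:AVFileTypeQuickTimeMovie
                                toOutput:outputURL
                            onCompletion:^{ NSLog(@"Done"); }
                                 onError:^(NSError *error) { NSLog(@"Failed: %@", error); }
                                onCancel:^{ NSLog(@"Cancelled"); }];
  }];
}];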

UPDATE: There is an easier way, but it comes with a disadvantage. You can use two AVPlayers. The first one plays your exported video, which has the white frames in between. The other one sits behind it, paused at the last frame of video 1. So when the blank part comes, you see the second AVPlayer showing that last frame, and as a whole it looks like video 1 is paused. Trust me, the naked eye can't tell when the player changes. The obvious disadvantage is that your exported video still contains the blank frames, so if you are only going to play it back inside your app, you can go with this approach.
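
A minimal sketch of that dual-player setup, assuming a view controller, that combinedURL points at the exported video with the blank gap, v1URL at the first clip, and that the gap boundaries are known (5 s and 10 s in the question's example); here the switch is done by hiding the foreground layer for the duration of the gap:

//Background player, parked on the last frame of video 1.
AVURLAsset *v1Asset = [AVURLAsset URLAssetWithURL:v1URL options:nil];
AVPlayer *freezePlayer = [AVPlayer playerWithURL:v1URL];
[freezePlayer seekToTime:v1Asset.duration
         toleranceBefore:kCMTimeZero
          toleranceAfter:kCMTimeZero];

AVPlayerLayer *freezeLayer = [AVPlayerLayer playerLayerWithPlayer:freezePlayer];
freezeLayer.frame = self.view.bounds;
[self.view.layer addSublayer:freezeLayer];

//Foreground player with the combined video (the one containing the blank gap).
AVPlayer *mainPlayer = [AVPlayer playerWithURL:combinedURL];
AVPlayerLayer *mainLayer = [AVPlayerLayer playerLayerWithPlayer:mainPlayer];
mainLayer.frame = self.view.bounds;
[self.view.layer addSublayer:mainLayer];

//Hide the foreground layer while the gap plays so the frozen frame shows through.
//Keep the returned observer token around in real code if you need to remove it later.
NSArray *gapBoundaries = @[[NSValue valueWithCMTime:CMTimeMakeWithSeconds(5, 600)],
                           [NSValue valueWithCMTime:CMTimeMakeWithSeconds(10, 600)]];
[mainPlayer addBoundaryTimeObserverForTimes:gapBoundaries
                                      queue:dispatch_get_main_queue()
                                 usingBlock:^{
                                   mainLayer.hidden = !mainLayer.hidden;
                                 }];

[mainPlayer play];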

Upvotes: 3
