Reputation: 3476
I am trying to crop a square frame from a video; the process I follow is shown in the code below.
What happens is that the video comes out correctly, but its duration varies in some cases.
The change in duration ranges from 0.1 to 1 second. That may seem like a very small difference, but for my use case the video duration has to be precise.
I am adding the full code in case you want to dive deeper into it.
AVAsset *asset = [AVAsset assetWithURL:customURL];
//create an avassetrack with our asset
AVAssetTrack *clipVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
//duration of the source asset, also as a float in seconds
CMTime originalVideoDur = asset.duration;
float orgDurFloat = (float)CMTimeGetSeconds(originalVideoDur);
//create a video composition and preset some settings
AVMutableVideoComposition* videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.frameDuration = CMTimeMake(1, 30);
//here we are setting its render size to a square (screen width in pixels on each side)
CGFloat outputWidth = UIScreen.mainScreen.bounds.size.width * UIScreen.mainScreen.scale;
videoComposition.renderSize = CGSizeMake(outputWidth, outputWidth);
//create a video instruction
AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
instruction.timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration);
AVMutableVideoCompositionLayerInstruction* transformer = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:clipVideoTrack];
CGAffineTransform finalTransform = [self getOutputTransformOfAsset:asset track:clipVideoTrack];
[transformer setTransform:finalTransform atTime:kCMTimeZero];
//add the transformer layer instructions, then add to video composition
instruction.layerInstructions = [NSArray arrayWithObject:transformer];
videoComposition.instructions = [NSArray arrayWithObject: instruction];
//Create an Export Path to store the cropped video
NSString * documentsPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSString *exportPath = [documentsPath stringByAppendingPathComponent:@"CroppedVideo2.mp4"];
NSURL *exportUrl = [NSURL fileURLWithPath:exportPath];
//Remove any previous video at that path
[[NSFileManager defaultManager] removeItemAtURL:exportUrl error:nil];
//Export
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetHighestQuality];
exporter.videoComposition = videoComposition;
exporter.outputURL = exportUrl;
exporter.outputFileType = AVFileTypeMPEG4;
exporter.timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration);
[exporter exportAsynchronouslyWithCompletionHandler:^{
    dispatch_async(dispatch_get_main_queue(), ^{
        //Called when the export finishes
    });
}];
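For reference, a minimal sketch of how the completion handler can verify the result (the status check and the duration logging are illustrative additions, not part of my original handler):
[exporter exportAsynchronouslyWithCompletionHandler:^{
    dispatch_async(dispatch_get_main_queue(), ^{
        if (exporter.status == AVAssetExportSessionStatusCompleted) {
            //Re-open the exported file and compare its duration with the source
            //(reading .duration synchronously here may block briefly)
            AVAsset *exportedAsset = [AVAsset assetWithURL:exportUrl];
            NSLog(@"source: %.3f s, exported: %.3f s",
                  CMTimeGetSeconds(asset.duration),
                  CMTimeGetSeconds(exportedAsset.duration));
        } else {
            NSLog(@"Export failed: %@", exporter.error);
        }
    });
}];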
The things I tested that did not work:
Upvotes: 1
Views: 809
Reputation: 3476
Found the issue. It's not really an issue, to be honest; it seems to be a system bug. The exporter was ignoring the last static frames for no apparent reason. At the point where I set the transform at kCMTimeZero, I added a new line that sets the same transform at the end of the video:
[transformer setTransform:finalTransform atTime:asset.duration];
Now the exporter doesn't ignore the last few frames.
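For anyone hitting the same thing, the layer instruction part of the code from the question ends up like this with the extra line added:
AVMutableVideoCompositionLayerInstruction *transformer = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:clipVideoTrack];
CGAffineTransform finalTransform = [self getOutputTransformOfAsset:asset track:clipVideoTrack];
//Anchor the transform at both ends so the exporter keeps the trailing static frames
[transformer setTransform:finalTransform atTime:kCMTimeZero];
[transformer setTransform:finalTransform atTime:asset.duration];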
Upvotes: 1