Reputation: 1046
I have three videos. The first is from the rear camera. The second is from the front camera and the third is again from the rear camera. The videos are always taken in landscape mode with the home button on the right.
The rear-facing videos are in the correct orientation. The center video, taken with the front camera, is rotated 180 degrees (upside down). I have been researching and trying numerous methods to transform the center video, with no luck; I get the same result every time.
I am getting pretty frustrated with this whole process. Everything I read online, and the suggestions from reviewers here, should work, but it does not. The video is the same no matter what transformation I try; it continually acts as if I did not apply any transformation at all. I do not understand why the transformations are being ignored. I have spent weeks on this and I am at the end of my rope - it simply does not work.
Here is the current iteration of my code:
- (void)mergeVideos2:(NSMutableArray *)assets withCompletion:(void (^)(NSString *))completion {
    AVMutableComposition *mutableComposition = [AVMutableComposition composition];
    AVMutableCompositionTrack *videoCompositionTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                                       preferredTrackID:kCMPersistentTrackID_Invalid];
    __block NSMutableArray *instructions = [[NSMutableArray alloc] init];
    __block CGSize size = CGSizeZero;
    __block CMTime time = kCMTimeZero;
    __block AVMutableVideoComposition *mutableVideoComposition = [AVMutableVideoComposition videoComposition];
    __block CGAffineTransform transformflip = CGAffineTransformMakeScale(1, -1);
    // __block CGAffineTransform transformflip = CGAffineTransformMakeRotation(M_PI);
    __block int32_t commontimescale = 600;
    [assets enumerateObjectsUsingBlock:^(id _Nonnull obj, NSUInteger idx, BOOL * _Nonnull stop) {
        NSURL *assetUrl = (NSURL *)obj;
        AVAsset *asset = [AVAsset assetWithURL:assetUrl];
        CMTime cliptime = CMTimeConvertScale(asset.duration, commontimescale, kCMTimeRoundingMethod_QuickTime);
        NSLog(@"%s: Number of tracks: %lu", __PRETTY_FUNCTION__, (unsigned long)[[asset tracks] count]);
        AVAssetTrack *assetTrack = [asset tracksWithMediaType:AVMediaTypeVideo].firstObject;
        NSError *error;
        [videoCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, cliptime)
                                       ofTrack:assetTrack
                                        atTime:time
                                         error:&error];
        if (error) {
            NSLog(@"%s: Error - %@", __PRETTY_FUNCTION__, error.debugDescription);
        }
        AVMutableVideoCompositionLayerInstruction *videoLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoCompositionTrack];
        CGAffineTransform transform = assetTrack.preferredTransform;
        [videoLayerInstruction setTransform:CGAffineTransformConcat(transform, transformflip) atTime:time];
        // the main instruction set - this is wrapping the time
        AVMutableVideoCompositionInstruction *videoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
        videoCompositionInstruction.timeRange = CMTimeRangeMake(time, assetTrack.timeRange.duration);
        if (videoLayerInstruction != nil)
            videoCompositionInstruction.layerInstructions = @[videoLayerInstruction];
        [instructions addObject:videoCompositionInstruction];
        // time increment variables
        time = CMTimeAdd(time, cliptime);
        if (CGSizeEqualToSize(size, CGSizeZero)) {
            size = assetTrack.naturalSize;
        }
    }];
    mutableVideoComposition.instructions = instructions;
    // set the frame rate to 12 fps (one frame every 1/12 second)
    mutableVideoComposition.frameDuration = CMTimeMake(1, 12);
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths firstObject];
    int number = arc4random_uniform(10000);
    self.outputFile = [documentsDirectory stringByAppendingFormat:@"/export_%i.mov", number];
    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mutableComposition
                                                                      presetName:AVAssetExportPreset1280x720];
    exporter.outputURL = [NSURL fileURLWithPath:self.outputFile];
    // Set the output file type
    exporter.outputFileType = AVFileTypeQuickTimeMovie;
    exporter.shouldOptimizeForNetworkUse = YES;
    dispatch_group_t group = dispatch_group_create();
    dispatch_group_enter(group);
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        dispatch_group_leave(group);
    }];
    dispatch_group_notify(group, dispatch_get_main_queue(), ^{
        // get the size of the file
        unsigned long long size = ([[[NSFileManager defaultManager] attributesOfItemAtPath:self.outputFile error:nil] fileSize]);
        NSString *filesize = [NSByteCountFormatter stringFromByteCount:size countStyle:NSByteCountFormatterCountStyleFile];
        NSString *thereturn = [NSString stringWithFormat:@"%@: %@", self.outputFile, filesize];
        NSLog(@"Export File (Final) - %@", self.outputFile);
        completion(thereturn);
    });
}
Any ideas or suggestions?
Upvotes: 1
Views: 1947
Reputation: 3216
Each AVAssetTrack has a preferredTransform property. It contains the information on how to rotate and translate the video to display it properly, so you don't have to guess. Use each video's preferredTransform in each layer instruction.
Don't set "videoCompositionTrack.preferredTransform = ..."
Remove the transform ramp "[videoLayerInstruction setTransformRampFromStartTransform:..."
In that enumeration, just use:
CGAffineTransform transform = assetTrack.preferredTransform;
[videoLayerInstruction setTransform:transform atTime:time];
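If you're unsure what a track's preferredTransform actually holds, you can log its components before building the instructions - a minimal sketch, where assetUrl stands for one of your clip URLs:
AVAsset *asset = [AVAsset assetWithURL:assetUrl];
AVAssetTrack *track = [asset tracksWithMediaType:AVMediaTypeVideo].firstObject;
CGAffineTransform t = track.preferredTransform;
// the identity (a=1, b=0, c=0, d=1, tx=0, ty=0) means no rotation is needed;
// a=-1 and d=-1 with nonzero tx/ty typically encode a 180-degree rotation
NSLog(@"a=%.2f b=%.2f c=%.2f d=%.2f tx=%.2f ty=%.2f", t.a, t.b, t.c, t.d, t.tx, t.ty);
NSLog(@"rotation: %.0f degrees", atan2(t.b, t.a) * 180.0 / M_PI);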
I'm assuming your videos are shot with the same dimensions as your output, with the middle video having its width and height reversed. If they are not, you'll have to add the appropriate scaling:
float scaleFactor = ...; // i.e. (outputWidth / videoWidth)
CGAffineTransform scale = CGAffineTransformMakeScale(scaleFactor, scaleFactor);
transform = CGAffineTransformConcat(transform, scale);
[videoLayerInstruction setTransform:transform atTime:time];
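For instance, with a 1280x720 output and a clip whose naturalSize is 1920x1080, scaleFactor would be 1280.0 / 1920.0, roughly 0.67. A short sketch under those assumed dimensions:
CGSize renderSize = CGSizeMake(1280, 720);                // assumed output size
CGSize naturalSize = assetTrack.naturalSize;              // assumed 1920x1080 here
float scaleFactor = renderSize.width / naturalSize.width; // 1280 / 1920 ~ 0.67
CGAffineTransform scale = CGAffineTransformMakeScale(scaleFactor, scaleFactor);
CGAffineTransform scaled = CGAffineTransformConcat(assetTrack.preferredTransform, scale);
[videoLayerInstruction setTransform:scaled atTime:time];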
EDIT: It appears that the source videos that appeared upside down in the composition were upside down to begin with, but had an identity CGAffineTransform. This code worked to show them in the correct orientation:
- (void)mergeVideos2:(NSMutableArray *)assets withCompletion:(void (^)(NSString *))completion {
    AVMutableComposition *mutableComposition = [AVMutableComposition composition];
    AVMutableCompositionTrack *videoCompositionTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                                       preferredTrackID:kCMPersistentTrackID_Invalid];
    __block NSMutableArray *instructions = [[NSMutableArray alloc] init];
    __block CMTime time = kCMTimeZero;
    __block AVMutableVideoComposition *mutableVideoComposition = [AVMutableVideoComposition videoComposition];
    __block int32_t commontimescale = 600;
    // Create one layer instruction. We have one video track, and there should be one layer instruction per video track.
    AVMutableVideoCompositionLayerInstruction *videoLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoCompositionTrack];
    [assets enumerateObjectsUsingBlock:^(id _Nonnull obj, NSUInteger idx, BOOL * _Nonnull stop) {
        NSURL *assetUrl = (NSURL *)obj;
        AVAsset *asset = [AVAsset assetWithURL:assetUrl];
        CMTime cliptime = CMTimeConvertScale(asset.duration, commontimescale, kCMTimeRoundingMethod_QuickTime);
        NSLog(@"%s: Number of tracks: %lu", __PRETTY_FUNCTION__, (unsigned long)[[asset tracks] count]);
        AVAssetTrack *assetTrack = [asset tracksWithMediaType:AVMediaTypeVideo].firstObject;
        CGSize naturalSize = assetTrack.naturalSize;
        NSError *error;
        // insert the video from the assetTrack into the composition track
        [videoCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, cliptime)
                                       ofTrack:assetTrack
                                        atTime:time
                                         error:&error];
        if (error) {
            NSLog(@"%s: Error - %@", __PRETTY_FUNCTION__, error.debugDescription);
        }
        CGAffineTransform transform = assetTrack.preferredTransform;
        // set the layer to have this video's transform at the time that this video starts
        if (idx == 1) { // here the middle clip is the front-camera video with the wrong orientation; adapt this test to your own setup
            // these videos have the identity transform, yet they are upside down.
            // we need to rotate them by M_PI radians (180 degrees) and shift the video back into place
            CGAffineTransform rotateTransform = CGAffineTransformMakeRotation(M_PI);
            CGAffineTransform translateTransform = CGAffineTransformMakeTranslation(naturalSize.width, naturalSize.height);
            [videoLayerInstruction setTransform:CGAffineTransformConcat(rotateTransform, translateTransform) atTime:time];
        } else {
            [videoLayerInstruction setTransform:transform atTime:time];
        }
        // time increment variables
        time = CMTimeAdd(time, cliptime);
    }];
    // the main instruction set - this is wrapping the time
    AVMutableVideoCompositionInstruction *videoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    videoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, mutableComposition.duration); // make the instruction last for the entire composition
    videoCompositionInstruction.layerInstructions = @[videoLayerInstruction];
    [instructions addObject:videoCompositionInstruction];
    mutableVideoComposition.instructions = instructions;
    // set the frame rate to 12 fps (one frame every 1/12 second)
    mutableVideoComposition.frameDuration = CMTimeMake(1, 12);
    // set the render size for the video we're about to write
    mutableVideoComposition.renderSize = CGSizeMake(1280, 720);
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths firstObject];
    int number = arc4random_uniform(10000);
    self.outputFile = [documentsDirectory stringByAppendingFormat:@"/export_%i.mov", number];
    // let the render size of the video composition dictate the output size; use a quality preset here
    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mutableComposition
                                                                      presetName:AVAssetExportPresetHighestQuality];
    exporter.outputURL = [NSURL fileURLWithPath:self.outputFile];
    // Set the output file type
    exporter.outputFileType = AVFileTypeQuickTimeMovie;
    exporter.shouldOptimizeForNetworkUse = YES;
    exporter.videoComposition = mutableVideoComposition;
    dispatch_group_t group = dispatch_group_create();
    dispatch_group_enter(group);
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        dispatch_group_leave(group);
    }];
    dispatch_group_notify(group, dispatch_get_main_queue(), ^{
        // get the size of the file
        unsigned long long size = ([[[NSFileManager defaultManager] attributesOfItemAtPath:self.outputFile error:nil] fileSize]);
        NSString *filesize = [NSByteCountFormatter stringFromByteCount:size countStyle:NSByteCountFormatterCountStyleFile];
        NSString *thereturn = [NSString stringWithFormat:@"%@: %@", self.outputFile, filesize];
        NSLog(@"Export File (Final) - %@", self.outputFile);
        completion(thereturn);
    });
}
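As a sanity check on the rotate-and-translate math above: rotating by M_PI maps a point (x, y) to (-x, -y), which pushes the frame into negative coordinates, and translating by (width, height) shifts it back into view. A small sketch, assuming a hypothetical 1280x720 clip:
CGAffineTransform rotate = CGAffineTransformMakeRotation(M_PI);
CGAffineTransform translate = CGAffineTransformMakeTranslation(1280, 720); // assumed clip size
CGAffineTransform fix = CGAffineTransformConcat(rotate, translate);        // rotate first, then translate
// the bottom-right corner should land (approximately) on the origin after the flip
CGPoint p = CGPointApplyAffineTransform(CGPointMake(1280, 720), fix);
NSLog(@"(1280, 720) -> (%.1f, %.1f)", p.x, p.y); // ~ (0.0, 0.0), up to floating-point error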
Upvotes: 4