Nicolas Manzini

Reputation: 8546

How to create a dummy AVPlayerItem with a real duration?

I'm using an AVPlayer to play CAKeyframeAnimations on an AVSynchronizedLayer. Since no AVAsset actually plays during the animation, I set the forwardPlaybackEndTime of the AVPlayerItem to the desired animation duration to keep the player playing. Unfortunately, it seems impossible to seekToTime: within this forwardPlaybackEndTime: the AVPlayer always jumps back to the beginning, probably because it tries to seek within the AVPlayerItem's duration.
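For context, the setup looks roughly like this (a sketch; someAsset, animationDuration, and keyFrameAnimation are placeholders):

AVPlayerItem * item = [AVPlayerItem playerItemWithAsset:someAsset];
item.forwardPlaybackEndTime = animationDuration; // keep playing until the animation ends

AVPlayer * player = [AVPlayer playerWithPlayerItem:item];

// The synchronized layer drives the CAKeyframeAnimations from the item's timebase.
AVSynchronizedLayer * syncLayer =
[AVSynchronizedLayer synchronizedLayerWithPlayerItem:item];
[syncLayer addAnimation:keyFrameAnimation forKey:@"animation"];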

How can I create a dummy AVPlayerItem with a real duration, to trick the AVPlayer into playing an empty AVPlayerItem while letting me seekToTime:?

Upvotes: 3

Views: 1669

Answers (2)

kacho

Reputation: 392

I think it would be easier to just create an AVMutableComposition with an AVMutableCompositionTrack, into which you insert an empty range of the desired duration using insertEmptyTimeRange:.

Then use this composition to create your AVPlayerItem with playerItemWithAsset:, since AVMutableComposition is a subclass of AVAsset.

This doesn't require generating, writing, and then reading back a file, and it's much less code as well.
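A minimal sketch of this approach (untested; the wrapper function dummyPlayerItemWithDuration is mine):

#import <AVFoundation/AVFoundation.h>

AVPlayerItem * dummyPlayerItemWithDuration(CMTime duration)
{
    AVMutableComposition * composition = [AVMutableComposition composition];

    // Add a video track and fill it with an empty (blank) range of the desired length.
    AVMutableCompositionTrack * track =
    [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                             preferredTrackID:kCMPersistentTrackID_Invalid];
    [track insertEmptyTimeRange:CMTimeRangeMake(kCMTimeZero, duration)];

    // AVMutableComposition is an AVAsset subclass, so it can back a player item directly.
    return [AVPlayerItem playerItemWithAsset:composition];
}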

Upvotes: 2

Nicolas Manzini

Reputation: 8546

Unfortunately, seekToTime: will only seek within the AVPlayerItem's duration, so a dummy player item is required to provide a seekable duration. Here's an example implementation that generates such an item by writing a tiny two-frame black video to a temporary file. It's long, but it's required. Good luck!

#import <AVFoundation/AVFoundation.h>
#import <CoreVideo/CoreVideo.h>
#import <UIKit/UIKit.h>

@interface FakeAsset : NSObject

+ (void)assetWithDuration:(CMTime)duration
          completionBlock:(void (^)(AVAsset *))callBack;

@end

@interface FakeAsset ()

+ (CVPixelBufferRef)blackImagePixelBuffer;

@end

@implementation FakeAsset

+ (void)assetWithDuration:(CMTime)duration
          completionBlock:(void (^)(AVAsset *))callBack
{
    NSError * error      = nil;
    NSString * assetPath = nil;
    NSUInteger i         = 0;
    // Find an unused temporary file name for the dummy asset.
    do
    {
        assetPath =
        [NSTemporaryDirectory() stringByAppendingPathComponent:[NSString stringWithFormat:@"dummyAsset%lu.m4v", (unsigned long)i]];
        i++;
    }
    while ([[NSFileManager defaultManager] fileExistsAtPath:assetPath]);

    NSURL * fileURL = [NSURL fileURLWithPath:assetPath];

    NSParameterAssert(fileURL);

    AVAssetWriter * videoWriter =
    [[AVAssetWriter alloc] initWithURL:fileURL
                              fileType:AVFileTypeAppleM4V
                                 error:&error];
    NSParameterAssert(videoWriter);

    // Minimal quality settings: the asset is just two black frames.
    NSDictionary * compression  =
  @{
    AVVideoAverageBitRateKey      : @10,
    AVVideoProfileLevelKey        : AVVideoProfileLevelH264Main31,
    AVVideoMaxKeyFrameIntervalKey : @300
    };

    NSDictionary * outputSettings =
  @{
    AVVideoCodecKey                 : AVVideoCodecH264,
    AVVideoCompressionPropertiesKey : compression,
    AVVideoWidthKey                 : @120,
    AVVideoHeightKey                : @80
    };

    AVAssetWriterInput * videoWriterInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:outputSettings];
    NSParameterAssert(videoWriterInput);

    NSDictionary * parameters =
    @{(NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32ARGB),
      (NSString *)kCVPixelBufferWidthKey           : @120,
      (NSString *)kCVPixelBufferHeightKey          : @80
      };

    AVAssetWriterInputPixelBufferAdaptor * adaptor =
    [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:videoWriterInput
                                                                     sourcePixelBufferAttributes:parameters];
    NSParameterAssert(adaptor);
    NSParameterAssert([videoWriter canAddInput:videoWriterInput]);

    videoWriterInput.expectsMediaDataInRealTime = NO;

    [videoWriter addInput:videoWriterInput];

    // Don't call startWriting inside the assert: it would never run if assertions are disabled.
    BOOL started = [videoWriter startWriting];
    NSParameterAssert(started);

    [videoWriter startSessionAtSourceTime:kCMTimeZero];

    dispatch_queue_t dispatchQueue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);

    [videoWriterInput requestMediaDataWhenReadyOnQueue:dispatchQueue
                                            usingBlock:^
    {
        int frame = 0;
        while (videoWriterInput.isReadyForMoreMediaData)
        {
            if (frame < 2)
            {
                // Two frames are enough: one at time zero and one at the target
                // duration, so the written asset spans the whole duration.
                CMTime frameTime = frame ? duration : kCMTimeZero;
                CVPixelBufferRef buffer = [self blackImagePixelBuffer];

                [adaptor appendPixelBuffer:buffer
                      withPresentationTime:frameTime];

                CVBufferRelease(buffer);

                ++frame;
            }
            else
            {
                [videoWriterInput markAsFinished];
                [videoWriter endSessionAtSourceTime:duration];

                dispatch_async(dispatch_get_main_queue(), ^
                {
                    [videoWriter finishWritingWithCompletionHandler:^()
                     {
                         NSLog(@"did finish writing the video!");
                         AVURLAsset * asset =
                         [AVURLAsset assetWithURL:videoWriter.outputURL];
                         callBack(asset);
                     }];
                });
                break;
            }
        }
    }];
}

// Returns a +1 retained pixel buffer; the caller is responsible for releasing it.
+ (CVPixelBufferRef)blackImagePixelBuffer
{
    NSDictionary * options =
    @{
      (id)kCVPixelBufferCGImageCompatibilityKey         : @YES,
      (id)kCVPixelBufferCGBitmapContextCompatibilityKey : @YES
      };

    CVPixelBufferRef pxbuffer = NULL;
    CVReturn status =
    CVPixelBufferCreate(kCFAllocatorDefault, 120, 80, kCVPixelFormatType_32ARGB, (__bridge CFDictionaryRef)options, &pxbuffer);

    NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);

    CVPixelBufferLockBaseAddress(pxbuffer, 0);

    void * pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
    NSParameterAssert(pxdata != NULL);

    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    // Fill the buffer with opaque black (alpha is ignored via kCGImageAlphaNoneSkipFirst).
    CGContextRef context = CGBitmapContextCreate(pxdata, 120, 80, 8, 4*120, rgbColorSpace, (CGBitmapInfo)kCGImageAlphaNoneSkipFirst);

    NSParameterAssert(context);
    CGContextSetFillColorWithColor(context, [UIColor blackColor].CGColor);
    CGContextFillRect(context,CGRectMake(0.f, 0.f, 120.f, 80.f));
    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);

    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);

    return pxbuffer;
}

@end
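Usage is straightforward; for example (a sketch, assuming a 30-second dummy duration):

[FakeAsset assetWithDuration:CMTimeMakeWithSeconds(30.0, 600)
             completionBlock:^(AVAsset * asset)
{
    AVPlayerItem * item = [AVPlayerItem playerItemWithAsset:asset];
    AVPlayer * player   = [AVPlayer playerWithPlayerItem:item];
    // Seeking anywhere within the 30 seconds now works.
    [player seekToTime:CMTimeMakeWithSeconds(10.0, 600)];
    [player play];
}];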

Upvotes: 2
