Reputation: 62323
I need to create a variable-length silent "video" (i.e. it's just an image) that I can use in an AVPlayer on iOS.
Does anyone know of a way to create an AVPlayerItem which simply consists of an image that lasts for n seconds?
If I have to generate a .mov file, I would need that file to be very small.
Upvotes: 1
Views: 1437
Reputation: 62323
OK, I've gone with writing my own video. It turns out that if you write a video with the image you want as the first and last key frames (and those are the only key frames), you get a nice compact video that doesn't take "too" long to write.
My code is as follows:
// Assumes AVFoundation, CoreVideo and <libkern/OSAtomic.h> are imported.
- (CVPixelBufferRef) createPixelBufferOfSize: (CGSize) size fromUIImage: (UIImage*) pImage
{
    // Create an ARGB pixel buffer that is compatible with CGBitmapContext drawing.
    NSNumber* numYes = [NSNumber numberWithBool: YES];
    NSDictionary* pOptions = [NSDictionary dictionaryWithObjectsAndKeys: numYes, kCVPixelBufferCGImageCompatibilityKey,
                                                                         numYes, kCVPixelBufferCGBitmapContextCompatibilityKey,
                                                                         nil];
    CVPixelBufferRef retBuffer = NULL;
    CVReturn status = CVPixelBufferCreate( kCFAllocatorDefault, size.width, size.height, kCVPixelFormatType_32ARGB, (__bridge CFDictionaryRef)pOptions, &retBuffer );
    if ( status != kCVReturnSuccess )
    {
        return NULL;
    }

    CVPixelBufferLockBaseAddress( retBuffer, 0 );
    void* pPixelData = CVPixelBufferGetBaseAddress( retBuffer );

    CGColorSpaceRef colourSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef    context     = CGBitmapContextCreate( pPixelData, size.width, size.height, 8, 4 * size.width, colourSpace, (CGBitmapInfo)kCGImageAlphaNoneSkipFirst );

    // Aspect-fit the image into the output size, letterboxing or pillarboxing as required.
    CGSize inSize    = pImage.size;
    float  inAspect  = inSize.width / inSize.height;
    float  outAspect = size.width / size.height;

    CGRect drawRect;
    if ( inAspect > outAspect )
    {
        // Image is wider than the output; fit to width and letterbox top and bottom.
        float  scale   = inSize.width / size.width;
        CGSize outSize = CGSizeMake( size.width, inSize.height / scale );
        drawRect = CGRectMake( 0, (size.height / 2) - (outSize.height / 2), outSize.width, outSize.height );
    }
    else
    {
        // Image is taller than the output; fit to height and pillarbox left and right.
        float  scale   = inSize.height / size.height;
        CGSize outSize = CGSizeMake( inSize.width / scale, size.height );
        drawRect = CGRectMake( (size.width / 2) - (outSize.width / 2), 0, outSize.width, outSize.height );
    }

    CGContextDrawImage( context, drawRect, [pImage CGImage] );

    CGColorSpaceRelease( colourSpace );
    CGContextRelease( context );
    CVPixelBufferUnlockBaseAddress( retBuffer, 0 );

    return retBuffer;
}
- (void) writeVideo: (NSURL*) pURL withImage: (UIImage*) pImage ofLength: (NSTimeInterval) length
{
    [[NSFileManager defaultManager] removeItemAtURL: pURL error: nil];

    NSError*       pError       = nil;
    AVAssetWriter* pAssetWriter = [AVAssetWriter assetWriterWithURL: pURL fileType: AVFileTypeQuickTimeMovie error: &pError];

    const int kVidWidth  = 1920;//pImage.size.width;
    const int kVidHeight = 1080;//pImage.size.height;

    NSNumber*     numVidWidth    = [NSNumber numberWithInt: kVidWidth];
    NSNumber*     numVidHeight   = [NSNumber numberWithInt: kVidHeight];
    NSDictionary* pVideoSettings = [NSDictionary dictionaryWithObjectsAndKeys: AVVideoCodecH264, AVVideoCodecKey,
                                                                               numVidWidth, AVVideoWidthKey,
                                                                               numVidHeight, AVVideoHeightKey,
                                                                               nil];

    AVAssetWriterInput* pAssetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType: AVMediaTypeVideo
                                                                               outputSettings: pVideoSettings];
    [pAssetWriter addInput: pAssetWriterInput];

    AVAssetWriterInputPixelBufferAdaptor* pAssetWriterInputPixelBufferAdaptor =
        [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput: pAssetWriterInput
                                                                         sourcePixelBufferAttributes: pVideoSettings];

    __block volatile int finished = 0;

    [pAssetWriter startWriting];
    [pAssetWriter startSessionAtSourceTime: kCMTimeZero];

    // Write the image as both the first and last frames; with nothing in between the file stays tiny.
    CVPixelBufferRef pixelBuffer = [self createPixelBufferOfSize: CGSizeMake( kVidWidth, kVidHeight ) fromUIImage: pImage];
    [pAssetWriterInputPixelBufferAdaptor appendPixelBuffer: pixelBuffer withPresentationTime: kCMTimeZero];
    [pAssetWriterInputPixelBufferAdaptor appendPixelBuffer: pixelBuffer withPresentationTime: CMTimeMake( length * 1000000, 1000000 )];
    CVPixelBufferRelease( pixelBuffer );

    [pAssetWriterInput markAsFinished];

    // Set end time accurate to micro-seconds.
    [pAssetWriter endSessionAtSourceTime: CMTimeMake( length * 1000000, 1000000 )];
    [pAssetWriter finishWritingWithCompletionHandler: ^
    {
        OSAtomicIncrement32( &finished );
    }];

    // Block until the asynchronous completion handler has run.
    while( finished == 0 )
    {
        [NSThread sleepForTimeInterval: 0.01];
    }
}
You may note that I am setting the video to always be 1920x1080 and letterboxing the image within that frame.
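For what it's worth, calling this from the class that declares the two methods above could look roughly like the sketch below; the temporary file name, image name and the surrounding player code are my own assumptions, not part of the original answer.

// Hypothetical usage: write a 10-second still "video" and hand it to an AVPlayer.
NSURL* pMovieURL = [NSURL fileURLWithPath:
    [NSTemporaryDirectory() stringByAppendingPathComponent: @"still.mov"]];

[self writeVideo: pMovieURL
       withImage: [UIImage imageNamed: @"poster"]   // assumed image name
        ofLength: 10.0];

AVPlayerItem* pItem   = [AVPlayerItem playerItemWithURL: pMovieURL];
AVPlayer*     pPlayer = [AVPlayer playerWithPlayerItem: pItem];
[pPlayer play];

Since writeVideo:withImage:ofLength: blocks until the writer has finished, you would probably want to run it off the main thread.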
Upvotes: 2
Reputation: 1388
You can create a .mov video from that image that plays for a very short time, say one second, and loop that video with:
yourplayer.actionAtItemEnd = AVPlayerActionAtItemEndNone;

[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(playerItemDidReachEnd:)
                                             name:AVPlayerItemDidPlayToEndTimeNotification
                                           object:[yourplayer currentItem]];

- (void)playerItemDidReachEnd:(NSNotification *)notification {
    [[yourplayer currentItem] seekToTime:kCMTimeZero];
}
If the video has a duration of n seconds, then you can use a counter in your playerItemDidReachEnd method and set a limit.
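A minimal sketch of that counter, assuming a loopCount property and a kMaxLoops constant (both names are mine, not from the answer):

// Hypothetical loop limit: stop after kMaxLoops passes through the short clip.
static const NSInteger kMaxLoops = 10;

- (void)playerItemDidReachEnd:(NSNotification *)notification {
    self.loopCount += 1;                      // assumed NSInteger property
    if (self.loopCount < kMaxLoops) {
        [[yourplayer currentItem] seekToTime:kCMTimeZero];
    } else {
        [yourplayer pause];
    }
}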
Upvotes: 1