Amogh Talpallikar

Reputation: 12184

How to get first frame of a video as the view loads?

As soon as the view loads, I want to display a paused video showing its first frame. For this I want to get the first frame of the video as a UIImage to display until the video is played.

Is there any way in iOS to get a frame of a video file stored locally, via its URL?

UPDATE: Please consider both my solution and Stavash's, and use whichever suits you.

Upvotes: 1

Views: 2464

Answers (2)

Amogh Talpallikar

Reputation: 12184

I found a way that works fine as well! Check out the Apple documentation for the AVAssetImageGenerator class and the AV Foundation Programming Guide. It has a method called copyCGImageAtTime:actualTime:error:.

Use the AVAsset class from the AVFoundation framework. AVAsset has a class method, + assetWithURL:, that gives you an asset object from the URL of a file. This method is only available on iOS 5 and above; you can check for it with respondsToSelector: and, if it is not available, use

+ (AVURLAsset *)URLAssetWithURL:(NSURL *)URL options:(NSDictionary *)options

AVURLAsset is a concrete subclass of AVAsset and can be used when the URL of the local video file is available.
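
For example, a rough sketch of the asset creation with that fallback might look like this (fileURL is a placeholder for the local video file's NSURL):

#import <AVFoundation/AVFoundation.h>

AVAsset *asset = nil;
if ([AVAsset respondsToSelector:@selector(assetWithURL:)]) {
    // iOS 5 and above
    asset = [AVAsset assetWithURL:fileURL];
} else {
    // Fallback for earlier iOS versions
    asset = [AVURLAsset URLAssetWithURL:fileURL options:nil];
}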

Once the asset is created, it has properties like naturalSize and duration, and lots of methods for working with media files.

Use + assetImageGeneratorWithAsset: of AVAssetImageGenerator, passing it the asset, to create an image generator.

Call the method mentioned above to get a CGImageRef, then use UIImage's imageWithCGImage: to get a UIImage and set it as a UIImageView's image property.

Plain and simple!

NOTE: Don't forget to add the Core Media framework for the CMTime you will pass to get the corresponding image frame.

Also add the AVFoundation framework for all the AVFoundation classes.
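
Putting the steps together, a minimal sketch of the whole flow (assuming asset was created as shown earlier, and previewImageView is a hypothetical UIImageView outlet):

AVAssetImageGenerator *generator = [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];
generator.appliesPreferredTrackTransform = YES;  // respect the video track's orientation

NSError *error = nil;
CMTime actualTime;
CGImageRef cgImage = [generator copyCGImageAtTime:kCMTimeZero
                                       actualTime:&actualTime
                                            error:&error];
if (cgImage) {
    UIImage *firstFrame = [UIImage imageWithCGImage:cgImage];
    self.previewImageView.image = firstFrame;
    CGImageRelease(cgImage);  // copyCGImageAtTime: returns a +1 reference
} else {
    NSLog(@"Could not generate the first frame: %@", error);
}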

Upvotes: 2

Stavash

Reputation: 14304

I found the following code; it might help you:

-(void) writeVideoFrameAtTime:(CMTime)time {
    if (![videoWriterInput isReadyForMoreMediaData]) {
        NSLog(@"Not ready for video data");
    }
    else {
        @synchronized (self) {
            UIImage* newFrame = [self.currentScreen retain];
            CVPixelBufferRef pixelBuffer = NULL;
            CGImageRef cgImage = CGImageCreateCopy([newFrame CGImage]);
            CFDataRef image = CGDataProviderCopyData(CGImageGetDataProvider(cgImage));

            // Get a pixel buffer from the adaptor's pool
            int status = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, avAdaptor.pixelBufferPool, &pixelBuffer);
            if (status != 0) {
                // Could not get a buffer from the pool
                NSLog(@"Error creating pixel buffer: status=%d", status);
            }
            else {
                // Copy the image data into the pixel buffer
                CVPixelBufferLockBaseAddress(pixelBuffer, 0);
                uint8_t* destPixels = CVPixelBufferGetBaseAddress(pixelBuffer);
                // Note: this works only if the pixel buffer is contiguous and has the same bytesPerRow as the input data
                CFDataGetBytes(image, CFRangeMake(0, CFDataGetLength(image)), destPixels);

                BOOL success = [avAdaptor appendPixelBuffer:pixelBuffer withPresentationTime:time];
                if (!success)
                    NSLog(@"Warning: Unable to write buffer to video");

                CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
                CVPixelBufferRelease(pixelBuffer);
            }

            // Clean up
            [newFrame release];
            CFRelease(image);
            CGImageRelease(cgImage);
        }
    }
}

Taken from http://codethink.no-ip.org/wordpress/archives/673#comment-8146

Upvotes: 1
