Reputation:
I recorded a video with a bluescreen. We have the software to convert that video to a transparent background. What's the best way to play this video overlaid on a custom UIView? Anytime I've seen videos on the iPhone it always launches that player interface. Any way to avoid this?
Upvotes: 30
Views: 15546
Reputation: 494
I had to do this in one of my apps and couldn't find a complete answer, so here is how to display a truly transparent video, without GPUImage, in SwiftUI using the HEVC codec. ProRes runs around 200 Mbps, so you want HEVC.
The video was exported from Premiere Pro as ProRes 4444 with alpha, then converted to HEVC with alpha.
The file inspector should show it like this:
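For reference, the ProRes-to-HEVC-with-alpha conversion can also be done in code with AVAssetExportSession (macOS 10.15 / iOS 13 and later). This is a sketch, not the exact tool the answer used, and the file URLs are placeholders:

```swift
import AVFoundation

// Sketch: re-encode a ProRes 4444 + alpha master as HEVC with alpha.
// Source and destination URLs are placeholders.
func exportHEVCWithAlpha(from source: URL, to destination: URL,
                         completion: @escaping (Bool) -> Void) {
    let asset = AVAsset(url: source)
    guard let session = AVAssetExportSession(
        asset: asset,
        presetName: AVAssetExportPresetHEVCHighestQualityWithAlpha) else {
        completion(false)
        return
    }
    session.outputURL = destination
    // HEVC with alpha must go in a QuickTime (.mov) container
    session.outputFileType = .mov
    session.exportAsynchronously {
        completion(session.status == .completed)
    }
}
```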
Test video: download here
import AVKit
import SwiftUI
import SpriteKit
struct MainView: View {
    @State var mainScene = SKScene()

    var body: some View {
        GeometryReader { geometry in
            Color.black
                .ignoresSafeArea()
            ZStack {
                Text("TEXT")
                    .foregroundColor(.white)
                    .font(.system(size: 140))
                // .allowsTransparency lets the scene's clear background show through
                SpriteView(scene: mainScene, options: [.allowsTransparency])
                    .frame(maxWidth: .infinity, maxHeight: .infinity)
                    .background(.clear)
                    .transition(.opacity)
            }.onAppear {
                let scene = SKScene(size: .init(width: geometry.size.width, height: 234))
                scene.scaleMode = .aspectFit
                scene.backgroundColor = .clear   // transparent scene background
                let url = Bundle.main.url(forResource: "test animation HEVC", withExtension: "mov")!
                let asset = AVAsset(url: url)
                let playerItem = AVPlayerItem(asset: asset)
                let player = AVPlayer(playerItem: playerItem)
                let video = SKVideoNode(avPlayer: player)
                video.size = CGSize(width: geometry.size.width, height: 234)
                video.anchorPoint = .init(x: 0, y: 0)
                scene.addChild(video)
                player.play()
                mainScene = scene
                // Loop by seeking back to the start when playback ends
                NotificationCenter.default.addObserver(forName: .AVPlayerItemDidPlayToEndTime, object: nil, queue: nil) { notification in
                    player.seek(to: .zero)
                    player.play()
                }
            }
        }
    }
}
Key parts: both the SpriteView ([.allowsTransparency]) and the SKScene (backgroundColor = .clear) must be transparent, and the asset itself must be HEVC with an alpha channel.
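As a side note, the seek-to-zero notification observer can be replaced with AVPlayerLooper (iOS 10+), which gives gapless looping. A sketch, assuming the same bundled file name as above:

```swift
import AVFoundation
import SpriteKit

// Sketch: gapless looping via AVPlayerLooper instead of observing
// AVPlayerItemDidPlayToEndTime. Keep a strong reference to the looper,
// or looping stops when it is deallocated.
var looper: AVPlayerLooper?

func makeLoopingVideoNode() -> SKVideoNode {
    let url = Bundle.main.url(forResource: "test animation HEVC", withExtension: "mov")!
    let item = AVPlayerItem(asset: AVAsset(url: url))
    let queuePlayer = AVQueuePlayer()
    looper = AVPlayerLooper(player: queuePlayer, templateItem: item)
    let video = SKVideoNode(avPlayer: queuePlayer)
    queuePlayer.play()
    return video
}
```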
Upvotes: 0
Reputation: 571
Don't know if anyone besides me is still interested in this, but I'm using GPUImage and its chroma key filter to achieve this: https://github.com/BradLarson/GPUImage
EDIT: example code of what I did (may be dated now):
-(void)AnimationGo:(GPUImageView*)view {
    NSURL *url = [[NSBundle mainBundle] URLForResource:@"test" withExtension:@"mov"];
    movieFile = [[GPUImageMovie alloc] initWithURL:url];
    filter = [[GPUImageChromaKeyBlendFilter alloc] init];
    [movieFile addTarget:filter];
    GPUImageView *imageView = (GPUImageView *)view;
    [imageView setBackgroundColorRed:0.0 green:0.0 blue:0.0 alpha:0.0];
    imageView.layer.opaque = NO;
    [filter addTarget:imageView];
    [movieFile startProcessing];
    // to loop
    [imageView setCompletionBlock:^{
        [movieFile removeAllTargets];
        [self AnimationGo:view];
    }];
}
I may have had to modify GPUImage a bit, and it may not work with the latest version, but that's what we used.
Upvotes: 9
Reputation: 4425
GPUImage would work, but it's not ideal, because the iOS device is not the place to do your video processing. You should do all your processing on the desktop using a professional video tool that handles chroma keying, then export a video with an alpha channel. Then import the video into your iOS application bundle as described in playing-movies-with-an-alpha-channel-on-the-ipad. There are a lot of quality and load-time issues you can avoid by making sure the video is properly converted to an alpha-channel video before it is loaded onto the iOS device.
Upvotes: 1
Reputation: 4008
I'm assuming what you're actually trying to do is remove the blue screen from your video in real time. You'll need to play the video through OpenGL, run pixel shaders on the frames, and finally render everything using an OpenGL layer with a transparent background.
See the "Capturing from the Camera using AV Foundation on iOS 5" session from WWDC 2011, which explains techniques for doing exactly that (watch the chroma key demo at 9:00). Presumably the source can be downloaded, but I can't find the link right now.
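A shader is not the only route: Apple's Core Image documentation has a "Chroma Key Filter Recipe" that builds a CIColorCube whose entries zero out the alpha of a hue range. A sketch along those lines (the hue range here is an assumption; tune it for your key color):

```swift
import UIKit
import CoreImage

// Sketch of the Core Image chroma key recipe: build a 64x64x64 color cube
// that makes a range of hues fully transparent, wrapped in CIColorCube.
func chromaKeyFilter(fromHue: CGFloat, toHue: CGFloat) -> CIFilter? {
    let size = 64
    var cube = [Float](repeating: 0, count: size * size * size * 4)
    var offset = 0
    for z in 0..<size {          // blue axis
        let b = CGFloat(z) / CGFloat(size - 1)
        for y in 0..<size {      // green axis
            let g = CGFloat(y) / CGFloat(size - 1)
            for x in 0..<size {  // red axis
                let r = CGFloat(x) / CGFloat(size - 1)
                // Convert RGB to hue to decide whether this color is keyed out
                var hue: CGFloat = 0
                UIColor(red: r, green: g, blue: b, alpha: 1)
                    .getHue(&hue, saturation: nil, brightness: nil, alpha: nil)
                let alpha: Float = (hue >= fromHue && hue <= toHue) ? 0 : 1
                // Premultiplied RGBA entries
                cube[offset]     = Float(r) * alpha
                cube[offset + 1] = Float(g) * alpha
                cube[offset + 2] = Float(b) * alpha
                cube[offset + 3] = alpha
                offset += 4
            }
        }
    }
    let data = Data(bytes: cube, count: cube.count * MemoryLayout<Float>.size)
    let filter = CIFilter(name: "CIColorCube")
    filter?.setValue(size, forKey: "inputCubeDimension")
    filter?.setValue(data, forKey: "inputCubeData")
    return filter
}
```

You would then apply this filter to each video frame (for example via AVVideoComposition) and composite the result over your view.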
Upvotes: 2
Reputation: 297
You'll need to build a custom player using AVFoundation.framework and then use a video with an alpha channel. AVFoundation allows much more robust handling of video without many of the limitations of the MediaPlayer framework. Building a custom player isn't as hard as people make it out to be. I've written a tutorial on it here: http://www.sdkboy.com/?p=66
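A minimal sketch of such a custom player: an AVPlayerLayer hosted in your own view, so no system player UI is ever presented (the transparency still depends on the video itself carrying an alpha channel; the class and method names are placeholders):

```swift
import UIKit
import AVFoundation

// Sketch: a bare-bones custom player view. AVPlayerLayer renders video
// directly into your own layer tree with no system chrome.
final class PlayerView: UIView {
    override class var layerClass: AnyClass { AVPlayerLayer.self }
    var playerLayer: AVPlayerLayer { layer as! AVPlayerLayer }

    func play(url: URL) {
        let player = AVPlayer(url: url)
        playerLayer.player = player
        playerLayer.videoGravity = .resizeAspect
        // Keep the layer non-opaque and request a pixel format with alpha
        playerLayer.isOpaque = false
        playerLayer.pixelBufferAttributes = [
            kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
        ]
        player.play()
    }
}
```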
Upvotes: 4
Reputation: 36048
If you can extract the frames from your video and save them as images, you can reproduce the video by cycling through the images. Here is an example of how you could cycle through your images so that it looks like a video:
In the image that I uploaded the images have different names, but if you name yours frame1, frame2, frame3, … then you can load them in a loop.
I have never tried it; I just know it works for simple animations. Hope it helps.
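The loop described above maps directly onto UIImageView's built-in flip-book animation. A sketch, assuming bundled PNGs named frame1, frame2, … (PNG frames keep their alpha, so this also gives you transparency):

```swift
import UIKit

// Sketch: play bundled images "frame1"..."frameN" as a flip-book animation.
// Frame count and frame rate are placeholders.
func makeAnimationView(frameCount: Int, fps: Double) -> UIImageView {
    let names = (1...frameCount).map { "frame\($0)" }
    let frames = names.compactMap { UIImage(named: $0) }
    let imageView = UIImageView()
    imageView.animationImages = frames
    imageView.animationDuration = Double(frameCount) / fps
    imageView.animationRepeatCount = 0   // 0 = loop forever
    imageView.startAnimating()
    return imageView
}
```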
Upvotes: -4
Reputation: 2046
You can't avoid launching the player interface if you want to use the built-in player.
Here's what I would try:
I don't think you can do much better than this without writing your own player, and I have no idea if this method would work or not.
Depending on the size of your videos and what you're trying to do, you could also try messing around with animated GIFs.
Upvotes: -4
Reputation: 24040
I'm not sure the iPhone APIs will let you have a movie view over the top of another view and still have transparency.
Upvotes: 0
Reputation:
The only way to avoid the player interface is to roll your own video player, which is pretty difficult to do right. You can insert a custom overlay on top of the player interface to make it look like the user is still in your app, but you don't actually have control of the view. You might try playing your transparent video in the player interface and seeing whether it shows up as transparent; check whether the player has a background-color property, and set that to transparent too.
--Mike
Upvotes: 0