Emma Labbé

Reputation: 697

Picture in Picture from AVSampleBufferDisplayLayer not loading

I'm trying to support Picture in Picture in my iOS app, and I need to display the content of a view, not a video. So I used a library to record a view and show the resulting video in an AVSampleBufferDisplayLayer. It works: the content of the view is displayed in the buffer display layer. But when I try to use PiP, only a loading indicator is shown. Here is my code:

import UIKit
import AVKit

class View: UIView {
    
    override class var layerClass: AnyClass {
        AVSampleBufferDisplayLayer.self
    }
}

class ViewController: UIViewController, AVPictureInPictureSampleBufferPlaybackDelegate {
    
    func pictureInPictureController(_ pictureInPictureController: AVPictureInPictureController, setPlaying playing: Bool) {
        
    }
    
    func pictureInPictureControllerTimeRangeForPlayback(_ pictureInPictureController: AVPictureInPictureController) -> CMTimeRange {
        .init(start: .zero, duration: self.buffers.first?.duration ?? .indefinite)
    }
    
    func pictureInPictureControllerIsPlaybackPaused(_ pictureInPictureController: AVPictureInPictureController) -> Bool {
        false
    }
    
    func pictureInPictureController(_ pictureInPictureController: AVPictureInPictureController, didTransitionToRenderSize newRenderSize: CMVideoDimensions) {
        
    }
    
    func pictureInPictureController(_ pictureInPictureController: AVPictureInPictureController, skipByInterval skipInterval: CMTime, completion completionHandler: @escaping () -> Void) {
        
    }

    @IBOutlet weak var playerView: View!
    
    @IBOutlet weak var textView: UITextView!
    
    var pipController: AVPictureInPictureController?
        
    var glimpse: Glimpse!
    
    var isRecording = false
    
    var buffers = [CMSampleBuffer]()
    
    @IBAction func pip() {
        pipController?.startPictureInPicture()
    }
    
    func startRecording() {
        glimpse = Glimpse()
        glimpse.startRecording(textView, withCallback: { url in
            if let url = url {
                do {
                    DispatchQueue.main.async {
                        (self.playerView.layer as! AVSampleBufferDisplayLayer).flush()
                        
                        if self.pipController == nil {
                            self.pipController = AVPictureInPictureController(contentSource: .init(sampleBufferDisplayLayer: self.playerView.layer as! AVSampleBufferDisplayLayer, playbackDelegate: self))
                            self.pipController?.requiresLinearPlayback = true
                        }
                    }
                    
                    let reader = try AVAssetReader(asset: AVAsset(url: url))
                    let output = AVAssetReaderTrackOutput(track: reader.asset.tracks.first!, outputSettings: nil)
                    reader.add(output)
                    reader.startReading()
                    
                    while let buffer = output.copyNextSampleBuffer() {
                        self.buffers.append(buffer)
                    }
                    
                    try FileManager.default.removeItem(at: url)
                } catch {
                    print(error)
                }
            }
        })
        
        isRecording = true
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        
        var i = 0
        _ = Timer.scheduledTimer(withTimeInterval: 1, repeats: true, block: { _ in
            self.textView.text += "Hello World! (\(i))\n"
            if self.isRecording {
                self.glimpse.stop()
                self.startRecording()
            }
            i += 1
        })
        
        let layer = playerView.layer as! AVSampleBufferDisplayLayer
        layer.requestMediaDataWhenReady(on: .global()) {
            if let buffer = self.buffers.first {
                layer.enqueue(buffer)
                self.buffers.remove(at: 0)
            }
        }
        
        startRecording()
    }
}

In this example, I modify the content of a UITextView every second and record a video of it. Then I extract the CMSampleBuffers and enqueue them in the AVSampleBufferDisplayLayer.

I attached two screenshots: the first shows the content of the text view successfully displayed in the AVSampleBufferDisplayLayer, and the second shows that nothing is displayed when PiP is enabled.

PIP Disabled

PIP Enabled

What am I doing wrong?

Upvotes: 0

Views: 2324

Answers (2)

enodev

Reputation: 81

I have experienced the same behavior when returning an incorrect time range for playback. Make sure you return .positiveInfinity for the duration; otherwise your layer will be covered with the loading indicator.

    func pictureInPictureControllerTimeRangeForPlayback(_ pictureInPictureController: AVPictureInPictureController) -> CMTimeRange {
        return CMTimeRange(start: .negativeInfinity, duration: .positiveInfinity)
    }

Documented here:

https://developer.apple.com/documentation/avkit/avpictureinpicturesamplebufferplaybackdelegate/3750337-pictureinpicturecontrollertimera?changes=la

Upvotes: 6

palmin

Reputation: 297

I have something like this working in Secure ShellFish, and it made a difference how the CMSampleBuffers were created.

I had to create the CMSampleBuffer from a CVPixelBuffer that was IOSurface-compatible, and I had to mark the CMSampleBuffer with kCMSampleAttachmentKey_DisplayImmediately.
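For reference, a minimal sketch of what that buffer creation might look like. This is an assumption based on the description above, not code from Secure ShellFish; the helper name `makeSampleBuffer` and the drawing step are placeholders.

```swift
import AVFoundation
import CoreVideo

// Sketch: build an IOSurface-backed CVPixelBuffer, wrap it in a CMSampleBuffer,
// and tag it for immediate display, as the answer describes.
func makeSampleBuffer(width: Int, height: Int) -> CMSampleBuffer? {
    // Requesting IOSurface backing: passing an empty dictionary for this key opts in.
    let attributes: [CFString: Any] = [
        kCVPixelBufferIOSurfacePropertiesKey: [:] as CFDictionary
    ]
    var pixelBuffer: CVPixelBuffer?
    guard CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                              kCVPixelFormatType_32BGRA,
                              attributes as CFDictionary,
                              &pixelBuffer) == kCVReturnSuccess,
          let buffer = pixelBuffer else { return nil }

    // (Render the view's contents into `buffer` here, e.g. via CGContext.)

    var formatDescription: CMFormatDescription?
    CMVideoFormatDescriptionCreateForImageBuffer(allocator: kCFAllocatorDefault,
                                                 imageBuffer: buffer,
                                                 formatDescriptionOut: &formatDescription)
    guard let format = formatDescription else { return nil }

    var timing = CMSampleTimingInfo(duration: .invalid,
                                    presentationTimeStamp: CMClockGetTime(CMClockGetHostTimeClock()),
                                    decodeTimeStamp: .invalid)
    var sampleBuffer: CMSampleBuffer?
    CMSampleBufferCreateReadyWithImageBuffer(allocator: kCFAllocatorDefault,
                                             imageBuffer: buffer,
                                             formatDescription: format,
                                             sampleTiming: &timing,
                                             sampleBufferOut: &sampleBuffer)

    // Mark the buffer so the display layer shows it immediately.
    if let sample = sampleBuffer,
       let attachmentsArray = CMSampleBufferGetSampleAttachmentsArray(sample, createIfNecessary: true),
       CFArrayGetCount(attachmentsArray) > 0 {
        let attachments = unsafeBitCast(CFArrayGetValueAtIndex(attachmentsArray, 0),
                                        to: CFMutableDictionary.self)
        CFDictionarySetValue(attachments,
                             Unmanaged.passUnretained(kCMSampleAttachmentKey_DisplayImmediately).toOpaque(),
                             Unmanaged.passUnretained(kCFBooleanTrue).toOpaque())
    }
    return sampleBuffer
}
```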

Upvotes: 1
