SomeoneAlt-86

Reputation: 67

Image Recognition Results are not printed in Swift Playgrounds

I have been working on a playground to recognize objects in a live capture, but when I try to print the results, nothing is printed. Here is my code. I have also tried stepping through it, and it just executes the `return` in the `guard let` for the results. The SetupLabel func also cannot be executed: when it runs, the playground reports a problem.

import UIKit
import AVFoundation
import Vision

class ViewController: UIViewController, AVCaptureVideoDataOutputSampleBufferDelegate {
// a stored property initialized by an immediately-invoked closure
let label: UILabel = {
    let label = UILabel()
    label.textColor = .white
    label.translatesAutoresizingMaskIntoConstraints = false
    label.text = "Label"
    label.font = label.font.withSize(30)
    return label
}()



override func viewDidLoad() {
    super.viewDidLoad()
    setupCaptureSession()
    
}

func setupCaptureSession(){
    let captureSession = AVCaptureSession()
    
    // search for devices with specifications defined 
    let availableDevices = AVCaptureDevice.DiscoverySession(deviceTypes: [.builtInWideAngleCamera], mediaType: .video, position: .back).devices
    
    // setup capture device add input to captureSession
    do{
        if let captureDevice = availableDevices.first{
            let captureDeviceInput = try AVCaptureDeviceInput(device: captureDevice) 
            
            captureSession.addInput(captureDeviceInput)
        }
        
    }catch{
        print(error.localizedDescription)
    }
    
    // setup output and output to captureSession
    let captureOutput = AVCaptureVideoDataOutput()
    captureOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "videoQueue"))
    captureSession.addOutput(captureOutput)
    
    let previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
    previewLayer.frame = view.frame
    view.layer.addSublayer(previewLayer)
    
    captureSession.startRunning()
}

// called when a frame is captured 
func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
    
    guard let model = try? VNCoreMLModel(for: myYolo) else {return}
    
    
    let request = VNCoreMLRequest(model: model) { (finishedRequest, error) in
        print(finishedRequest.results)
        
        
        
        guard let results = finishedRequest.results as? [VNClassificationObservation] else { return }
        guard let firstObservation = results.first else { return }

        DispatchQueue.main.async {
            self.label.text = "\(firstObservation.identifier)"
        }
    }
    guard let pixelBuffer: CVPixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
    
    // executes request
    try? VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:]).perform([request])
}

func SetupLabel(){
    // the label must be in the view hierarchy before its constraints are activated
    view.addSubview(label)

    label.centerXAnchor.constraint(equalTo: view.centerXAnchor).isActive = true
    label.bottomAnchor.constraint(equalTo: view.bottomAnchor, constant: -50).isActive = true
}
}

Upvotes: 0

Views: 107

Answers (1)

valosip

Reputation: 3402

You need to run it on a real device.
Vision requests will not work in playgrounds/simulator.
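As a minimal sketch of how to fail fast instead of silently producing no frames, you could check for a back camera before starting the session; `startCaptureIfPossible` is a hypothetical helper wrapping the question's `setupCaptureSession()`:

```swift
import AVFoundation

// Sketch: bail out early when no back camera exists (as in the
// Simulator/Playgrounds), rather than starting a session that never
// delivers sample buffers. The helper name is an assumption.
func startCaptureIfPossible(in controller: ViewController) {
    let discovery = AVCaptureDevice.DiscoverySession(
        deviceTypes: [.builtInWideAngleCamera],
        mediaType: .video,
        position: .back
    )
    guard discovery.devices.first != nil else {
        print("No back camera found; run on a real device, not the Simulator or Playgrounds")
        return
    }
    controller.setupCaptureSession()
}
```

On hardware without a camera, `discovery.devices` is empty, so the guard prints the message and returns instead of letting the delegate callback sit idle.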

Upvotes: 2
