Reputation: 15
I am building a backend program for an educational web application whose frontend displays a video (which could be some video on the server) and simultaneously uses the webcam to record the user's responses to it. After this, there is some processing on the server side. I would like to port the frontend to iOS. I did a few SO searches, but I am not clear if they answer this exact question. Is there a way to record from the iOS device's front camera while simultaneously playing another video on the screen? My question concerns both the technical feasibility and Apple's rules regarding this.
If this is a duplicate, please point me to the original question, since I haven't been able to find a good answer to this. Thanks!
Harish.
Upvotes: 1
Views: 678
Reputation: 565
There should be no issue with doing this; however, you would need to monitor performance, since both the camera and video playback place a heavy load on the iOS device's processor and memory.
For both types of activity, you can use the AVFoundation framework provided as part of the iOS SDK. You would play your video onscreen (for example with an AVPlayer and an AVPlayerLayer), and you would capture video from the camera in a separate part of your application. The two areas are largely independent; the only real coordination needed is starting and stopping the recording in sync with the playback. To capture video from the camera without putting it onscreen, you would run an AVCaptureSession and handle the media it returns yourself, rather than attaching a preview layer.
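As a rough sketch of how the two pieces could fit together (the file names `lesson.mp4` and `response.mov` are placeholders, and permission and error handling are omitted for brevity):

```swift
import UIKit
import AVFoundation

class RecordAndPlayViewController: UIViewController, AVCaptureFileOutputRecordingDelegate {

    private let player = AVPlayer()
    private let captureSession = AVCaptureSession()
    private let movieOutput = AVCaptureMovieFileOutput()

    override func viewDidLoad() {
        super.viewDidLoad()
        configurePlayer()
        configureCapture()
    }

    private func configurePlayer() {
        // "lesson.mp4" is a placeholder for whatever video your app shows.
        guard let url = Bundle.main.url(forResource: "lesson", withExtension: "mp4") else { return }
        player.replaceCurrentItem(with: AVPlayerItem(url: url))
        let playerLayer = AVPlayerLayer(player: player)
        playerLayer.frame = view.bounds
        view.layer.addSublayer(playerLayer)
    }

    private func configureCapture() {
        captureSession.beginConfiguration()
        // Front camera as input. Note that no preview layer is attached,
        // so the camera records without ever appearing onscreen.
        guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video,
                                                   position: .front),
              let videoInput = try? AVCaptureDeviceInput(device: camera),
              captureSession.canAddInput(videoInput) else { return }
        captureSession.addInput(videoInput)

        if let mic = AVCaptureDevice.default(for: .audio),
           let audioInput = try? AVCaptureDeviceInput(device: mic),
           captureSession.canAddInput(audioInput) {
            captureSession.addInput(audioInput)
        }

        guard captureSession.canAddOutput(movieOutput) else { return }
        captureSession.addOutput(movieOutput)
        captureSession.commitConfiguration()
        captureSession.startRunning()
    }

    // Start recording and playback at the same moment so the two line up.
    func startSession() {
        let outputURL = FileManager.default.temporaryDirectory
            .appendingPathComponent("response.mov")
        try? FileManager.default.removeItem(at: outputURL)
        movieOutput.startRecording(to: outputURL, recordingDelegate: self)
        player.play()
    }

    // Call this when the lesson video finishes playing.
    func stopSession() {
        movieOutput.stopRecording()
    }

    // MARK: - AVCaptureFileOutputRecordingDelegate
    func fileOutput(_ output: AVCaptureFileOutput,
                    didFinishRecordingTo outputFileURL: URL,
                    from connections: [AVCaptureConnection],
                    error: Error?) {
        // Upload outputFileURL to your server for processing here.
    }
}
```

You would also need to add camera and microphone usage descriptions (`NSCameraUsageDescription`, `NSMicrophoneUsageDescription`) to the app's Info.plist, and in a production app you would start the session off the main thread and handle permission denials gracefully.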
Upvotes: 1