Reputation: 83
I need to send different frames captured from the phone's back camera to a PC over a client/server connection (Android to PC). Basically, I don't need to record the video or save it to my local device; I only need to start the camera and then, when a custom callback of mine fires, grab the current frame, convert it to an array of bytes(?), send it to my computer, and turn it back into an image with OpenCV for machine learning analysis.
What's a possible way to achieve this? Should I use the Camera library, a SurfaceView, or is there a better approach? How do I get the current frame shown in the camera preview?
Upvotes: 2
Views: 1563
Reputation: 4365
The easiest way to implement this would be using CameraX: its ImageAnalysis use case hands you every camera frame in a callback, so you can convert a frame to a byte array on demand and push it over your socket.
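For example, here is a minimal Kotlin sketch of that approach. It assumes an AppCompatActivity with the camera permission already granted; `frameRequested` and `sendToServer` are placeholders for your own trigger callback and socket code, and the NV21 repacking is simplified (it assumes the common interleaved U/V layout instead of honouring row/pixel strides):

```kotlin
import android.graphics.ImageFormat
import android.graphics.Rect
import android.graphics.YuvImage
import androidx.appcompat.app.AppCompatActivity
import androidx.camera.core.CameraSelector
import androidx.camera.core.ImageAnalysis
import androidx.camera.core.ImageProxy
import androidx.camera.lifecycle.ProcessCameraProvider
import androidx.core.content.ContextCompat
import java.io.ByteArrayOutputStream

// Set this from your own callback when the next frame should be sent.
@Volatile var frameRequested = false

fun startFrameAnalysis(activity: AppCompatActivity) {
    val providerFuture = ProcessCameraProvider.getInstance(activity)
    providerFuture.addListener({
        val cameraProvider = providerFuture.get()

        val analysis = ImageAnalysis.Builder()
            .setBackpressureStrategy(ImageAnalysis.STRATEGY_KEEP_ONLY_LATEST)
            .build()

        // The analyzer is invoked for every frame the camera produces.
        analysis.setAnalyzer(ContextCompat.getMainExecutor(activity)) { image ->
            if (frameRequested) {
                frameRequested = false
                sendToServer(imageProxyToJpeg(image))
            }
            image.close() // always close, or the analysis pipeline stalls
        }

        cameraProvider.bindToLifecycle(
            activity, CameraSelector.DEFAULT_BACK_CAMERA, analysis
        )
    }, ContextCompat.getMainExecutor(activity))
}

// Repack the YUV_420_888 planes into NV21 and compress to JPEG bytes.
private fun imageProxyToJpeg(image: ImageProxy, quality: Int = 80): ByteArray {
    val yBuf = image.planes[0].buffer
    val uBuf = image.planes[1].buffer
    val vBuf = image.planes[2].buffer
    val ySize = yBuf.remaining()
    val uSize = uBuf.remaining()
    val vSize = vBuf.remaining()

    val nv21 = ByteArray(ySize + uSize + vSize)
    yBuf.get(nv21, 0, ySize)
    vBuf.get(nv21, ySize, vSize)          // NV21 expects V before U
    uBuf.get(nv21, ySize + vSize, uSize)

    val yuv = YuvImage(nv21, ImageFormat.NV21, image.width, image.height, null)
    val out = ByteArrayOutputStream()
    yuv.compressToJpeg(Rect(0, 0, image.width, image.height), quality, out)
    return out.toByteArray()
}

// Placeholder for your client/server code: e.g. write the array to a
// Socket's OutputStream, length-prefixed so the PC side knows where each
// frame ends before handing it to OpenCV.
private fun sendToServer(jpegBytes: ByteArray) {
    // TODO: your socket code
}
```

Sending compressed JPEG bytes keeps the payload small and lets the PC side rebuild the image directly with OpenCV's image-decoding functions.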
The other (manual) way of implementing this would be to attach a SurfaceTexture as the camera stream consumer (wrapped in a Surface that the camera renders into), implement the appropriate image decoding, and then pass the result to the OpenCV module. For example, you can use a TextureView, retrieve frames in the listener's onSurfaceTextureUpdated method, and read a bitmap back from the underlying OpenGL texture afterwards, as in the sketch below.
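A rough Kotlin sketch of that manual route, assuming the TextureView gets wired up as the camera's preview target inside onSurfaceTextureAvailable; the camera-opening code and the network send are left as placeholders, and note that getBitmap() copies pixels back from the GPU on every call, which is typically slower than the CameraX analyzer above:

```kotlin
import android.graphics.Bitmap
import android.graphics.SurfaceTexture
import android.view.TextureView
import java.io.ByteArrayOutputStream

fun attachFrameListener(textureView: TextureView) {
    textureView.surfaceTextureListener = object : TextureView.SurfaceTextureListener {
        override fun onSurfaceTextureAvailable(st: SurfaceTexture, width: Int, height: Int) {
            // Open the camera here and give it `st` (wrapped in a Surface)
            // as the preview output target.
        }

        override fun onSurfaceTextureSizeChanged(st: SurfaceTexture, width: Int, height: Int) {}

        override fun onSurfaceTextureDestroyed(st: SurfaceTexture): Boolean = true

        override fun onSurfaceTextureUpdated(st: SurfaceTexture) {
            // Fires once per drawn preview frame; gate this on your own
            // callback/flag so you only grab a frame when one is requested.
            val bitmap: Bitmap = textureView.bitmap ?: return
            val out = ByteArrayOutputStream()
            bitmap.compress(Bitmap.CompressFormat.JPEG, 80, out)
            val jpegBytes = out.toByteArray()
            // Send jpegBytes over your socket, just like in the CameraX sketch.
        }
    }
}
```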
Upvotes: 1