Adelson Araújo

Reputation: 342

How can I track eye movements with Google Video Intelligence API?

I have a video of three people speaking, and I would like to annotate the location of their eyes throughout it. I know that the Google Video Intelligence API has functionality for object tracking, but is it possible to handle such an eye-tracking process using the API?

Upvotes: 0

Views: 478

Answers (2)

Nick_Kh

Reputation: 5243

The Google Video Intelligence API provides a face detection feature, which gives you the opportunity to detect faces within video frames as well as specific face attributes.

In general, you need to set up a FaceDetectionConfig for the videos.annotate method, supplying the includeBoundingBoxes and includeAttributes fields in the JSON request body:

{
   "inputUri":"string",
   "inputContent":"string",
   "features":[
      "FACE_DETECTION"
   ],
   "videoContext":{
      "segments":[
         "object (VideoSegment)"
      ],
      "faceDetectionConfig":{
         "model":"string",
         "includeBoundingBoxes":"true",
         "includeAttributes":"true"
      }
   },
   "outputUri":"string",
   "locationId":"string"
}
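
For reference, here is a minimal Python sketch of the same request using the google-cloud-videointelligence client library. The bucket URI is a placeholder and the timeout is arbitrary; treat this as a starting point rather than a complete solution:

from google.cloud import videointelligence_v1 as videointelligence

client = videointelligence.VideoIntelligenceServiceClient()

# Ask the API to return bounding boxes and face attributes.
config = videointelligence.FaceDetectionConfig(
    include_bounding_boxes=True,
    include_attributes=True,
)
context = videointelligence.VideoContext(face_detection_config=config)

operation = client.annotate_video(
    request={
        "input_uri": "gs://your-bucket/your-video.mp4",  # placeholder URI
        "features": [videointelligence.Feature.FACE_DETECTION],
        "video_context": context,
    }
)
result = operation.result(timeout=300)

# Each face annotation contains tracks with per-frame bounding boxes and attributes.
for annotation in result.annotation_results[0].face_detection_annotations:
    for track in annotation.tracks:
        for obj in track.timestamped_objects:
            box = obj.normalized_bounding_box
            print(box.left, box.top, box.right, box.bottom)
            for attribute in obj.attributes:
                print(attribute.name, attribute.confidence)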

Upvotes: 1

Cloudkollektiv

Reputation: 14699

There is a detailed (Python) example from Google on how to track objects and print the detected objects afterward. You could combine this with the AIStreamer live object tracking feature, which lets you upload a live video stream and get results back.

Some ideas/steps you could follow (a sketch of the object tracking call is shown after this list):

  1. Recognize the eyes in the first frame of the video.
  2. Set/highlight a box around the eyes you are tracking.
  3. Track the eyes as an object in the next frames.
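
A minimal sketch of the object tracking call from that example, assuming the google-cloud-videointelligence client library and a placeholder Cloud Storage URI. Note that the API tracks generic objects, so it may not label eyes specifically; you would still need to pick out the relevant tracks:

from google.cloud import videointelligence_v1 as videointelligence

client = videointelligence.VideoIntelligenceServiceClient()

operation = client.annotate_video(
    request={
        "input_uri": "gs://your-bucket/your-video.mp4",  # placeholder URI
        "features": [videointelligence.Feature.OBJECT_TRACKING],
    }
)
result = operation.result(timeout=500)

# Each object annotation carries an entity description, a confidence score,
# and a normalized bounding box for every frame in which it was tracked.
for annotation in result.annotation_results[0].object_annotations:
    print(annotation.entity.description, annotation.confidence)
    for frame in annotation.frames:
        box = frame.normalized_bounding_box
        print(frame.time_offset, box.left, box.top, box.right, box.bottom)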

Upvotes: 1
