Anim

Reputation: 109

Emotion detection on a live screen

I am trying to build a system in Python to detect emotions in online meetings. I have already created a deep learning model that detects emotions.

I am using this code to view the screen:

import cv2
import mss
import numpy


with mss.mss() as sct:
    # Region of the screen to capture
    monitor = {"top": 0, "left": 0, "width": 1000, "height": 1000}

    while True:
        # Grab the screen region as a BGRA numpy array
        img = numpy.array(sct.grab(monitor))

        cv2.imshow("OpenCV/Numpy normal", img)

        # Press "q" to quit
        if cv2.waitKey(25) & 0xFF == ord("q"):
            cv2.destroyAllWindows()
            break

Is there any way I can connect my model to this screen capture and see the emotions on the live screen?

My output should be the face on the screen with the emotion text drawn on it.

Upvotes: 0

Views: 147

Answers (1)

helloworld

Reputation: 162

You can pass the img variable to your model and get the emotion (as text) during inference. After that, it's a matter of drawing the text on the image with cv2.putText and then displaying the updated image with cv2.imshow.

img = numpy.array(sct.grab(monitor))
detected_emotion = inference(img)    # your model infers the emotion here

# Draw the predicted emotion text onto the frame
image = cv2.putText(img, detected_emotion, org=(50, 50),
                    fontFace=cv2.FONT_HERSHEY_SIMPLEX, fontScale=1,
                    color=(255, 0, 0), thickness=2, lineType=cv2.LINE_AA)

cv2.imshow("OpenCV/Numpy normal", image)
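Since you also want the text to appear on the face itself, here is a minimal sketch of how this could be wired into your capture loop. It assumes your model exposes an inference(face_img) function that takes a face crop and returns an emotion string (replace the placeholder with your own model call), and it uses OpenCV's bundled Haar cascade as one possible way to locate faces; swap in whatever face detector you prefer.

import cv2
import mss
import numpy


def inference(face_img):
    # Placeholder: call your own emotion model here and return its label
    return "neutral"


# Haar cascade shipped with OpenCV, used here just to find face regions
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

with mss.mss() as sct:
    monitor = {"top": 0, "left": 0, "width": 1000, "height": 1000}

    while True:
        # mss returns BGRA; drop the alpha channel and copy so we can draw on it
        img = numpy.array(sct.grab(monitor))[:, :, :3].copy()

        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

        for (x, y, w, h) in faces:
            face = img[y:y + h, x:x + w]
            emotion = inference(face)
            # Box the face and write the predicted emotion just above it
            cv2.rectangle(img, (x, y), (x + w, y + h), (255, 0, 0), 2)
            cv2.putText(img, emotion, (x, y - 10),
                        cv2.FONT_HERSHEY_SIMPLEX, 0.9, (255, 0, 0), 2, cv2.LINE_AA)

        cv2.imshow("OpenCV/Numpy normal", img)

        if cv2.waitKey(25) & 0xFF == ord("q"):
            cv2.destroyAllWindows()
            break

Running inference on every frame can be slow, so you may want to run the detector and model only every few frames and reuse the last result in between.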

Upvotes: 2
