NumesSanguis

Reputation: 6292

Python scikit regression PCA on faces

I have a dataset of faces showing the emotion "happy". Every image has a percentage (integer value) of how happy the face is, ranging from 0-100% (0 being neutral and 100 maximally happy). I would like to apply PCA to reduce the dimensionality before applying machine learning, but I'm wondering how I should approach this.

My code so far:

import os
import cv2
import numpy as np
from sklearn.decomposition import PCA

folder = os.path.join('..','data','diff_face_s')


#Holds per emotion the data
class diffPCA():
    def __init__(self, emo):
        self.emo = emo
        #One flattened 270x270 image per row
        self.data = np.empty([0, 270 * 270], dtype=np.uint8)
        self.pers = []
        self.perc = []
        #PCA
        self.pca = PCA(n_components = 2)

    #Add flattened image    
    def process(self, img, pers, perc):
        #img: diff_face, pers: person, perc: percentage
        img_raw = cv2.imread(os.path.join(folder, img), 0)
        img_flat = img_raw.flatten()
        #Append the flattened image as a new row (vstack alone would discard previous rows)
        self.data = np.vstack((self.data, img_flat))
        self.pers.append(pers)
        self.perc.append(perc)

    def doPCA(self):
        self.pca.fit(self.data)

    def printPCA(self):
        print(self.pca.explained_variance_ratio_)


#Emotions
happy = diffPCA(1)

for img in os.listdir(folder):
    print(img)
    #TODO: parse person (pers) and percentage (perc) from the filename,
    #e.g. if the name starts with 1
    happy.process(img, pers, perc)

happy.doPCA()
happy.printPCA()

Questions:

What is the best approach to reduce the dimensionality of the images based on the percentage? Can I just make a target list of percentages, or do I have 100 classes (one for every percentage)?

Upvotes: 0

Views: 316

Answers (1)

eickenberg

Reputation: 14377

This example from the gallery may be helpful to obtain an idea of how to go about it.
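To expand on that idea with a sketch (not from the answer, and the parameter choices here are illustrative assumptions): since the percentage is a continuous value rather than 100 distinct classes, the usual scikit-learn pattern is to chain PCA with a regressor and pass the percentages directly as the target list, for example:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import Ridge
from sklearn.pipeline import make_pipeline

# Synthetic stand-in for the stacked flattened faces:
# 50 samples, 270*270 pixels each (replace with diffPCA's self.data)
rng = np.random.RandomState(0)
X = rng.rand(50, 270 * 270)
y = rng.randint(0, 101, size=50)  # happiness percentages, 0-100

# PCA reduces dimensionality; Ridge then treats the percentage as a
# continuous regression target instead of 100 separate classes.
model = make_pipeline(PCA(n_components=10), Ridge(alpha=1.0))
model.fit(X, y)

preds = model.predict(X)
print(preds.shape)  # one predicted percentage per sample
```

`n_components=10` and `alpha=1.0` are placeholders; in practice you would tune them, e.g. with `GridSearchCV` over the pipeline.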

Upvotes: 1
