biswajitGhosh

Reputation: 147

How to implement Deep EXpectation (DEX) age detection?

First of all, I'm new to deep learning, so please correct me if I make any mistakes.

I'm trying to implement age detection using the DEX method. My understanding so far is that they train a CNN based on the VGG-16 architecture. I'm using the IMDB-WIKI dataset, as suggested in their paper.

I'm using TensorFlow and Keras to train the model, in Python 3.

My steps to train the model (I'm starting with just the IMDB set; a rough sketch of these steps follows the list):

  1. Load the IMDB .mat file and split the data into a training set and a validation set (10% of the total dataset).
  2. Create a VGG-16 model with ImageNet weights (I believe ImageNet is a large dataset).
  3. Since ImageNet has 1000 classes, remove the model's last layer and put my single age output layer in its place.
  4. Also add a dropout layer on top of the output layer (frankly, I don't know how it works).
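
Roughly, here is how I build the model for steps 2-4 (just a minimal sketch of what I'm doing; the 0.5 dropout rate, the linear single-unit output and the loss are my own guesses, not something I took from the paper):

from tensorflow.keras.applications import VGG16
from tensorflow.keras.layers import Dense, Dropout
from tensorflow.keras.models import Model

# Step 2: full VGG-16 with ImageNet weights, including its fully connected layers
vgg = VGG16(weights="imagenet", include_top=True)

# Step 3: take the output of the last 4096-unit fc layer and attach my single age output
x = vgg.layers[-2].output
age = Dense(1, activation="linear", name="age")(x)

# Step 4: dropout on top of the output layer (not sure this is right)
age = Dropout(0.5)(age)

model = Model(inputs=vgg.input, outputs=age)
model.compile(optimizer="adam", loss="mse")  # the loss choice is also something I'm unsure about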

My experiments start from here :)

  1. Freeze the layers already pre-trained in VGG-16, except my newly added layers, so some parameters are non-trainable (a sketch of how I freeze the layers follows this list). In that case my training age accuracy is just 19%, which is too poor; I would hope real-age detection should reach 50-56%.
  2. Seeing that, I guessed it might be because I didn't train all the layers. I unfroze all the layers and tried to train, but it threw an out-of-memory exception. After that I froze only 8 layers of the whole architecture; after training for 40 epochs the age accuracy was 11%, which is less than before :(
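
For reference, this is roughly how I freeze the layers in both experiments (it reuses the vgg and model objects from the sketch above; the cut-off of 8 layers is just what I tried in the second run):

# Experiment 1: freeze every pre-trained VGG-16 layer; only my new age head stays trainable
for layer in vgg.layers:
    layer.trainable = False

# Experiment 2: freeze only the first 8 layers and fine-tune the rest
for layer in vgg.layers[:8]:
    layer.trainable = False
for layer in vgg.layers[8:]:
    layer.trainable = True

# re-compile after changing the trainable flags so the change takes effect
model.compile(optimizer="adam", loss="mse")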

Can anyone please help me understand this paper properly?

Thank you.

Upvotes: -2

Views: 257

Answers (2)

sefiks

Reputation: 1665

It is already implemented in the deepface package for Python:

#!pip install deepface
from deepface import DeepFace

# analyze the image for the requested facial attributes
obj = DeepFace.analyze("img1.jpg", actions = ["age", "gender"])
print(obj)

The model is trained based on the instructions of the DEX paper. It builds a VGG model in the background and loads pre-trained weights. Besides, it runs on the TensorFlow framework with Keras APIs.

Upvotes: 0

Thomas Pinetz

Reputation: 7148

"Also add a dropout layer on the top of output layer(frankly don't know how it is working)" - That is just plainly wrong. A dropout layer sets multiplies the output with 0, making the activation and the gradient 0. If you use this as your final layer with k percent, then your result will be rubbish in k percent of cases, e.g. dropping your accuracy. Just remove it and it should be better.

Upvotes: 1
