Emotion recognition using facial expressions

As the name suggests, this project aims to give a computer the ability to classify seven basic emotions from human facial expressions. The seven basic emotions we classify are:

  • Happy
  • Sad
  • Angry
  • Disgust
  • Fear
  • Surprise
  • Contempt
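
For reference, here is a minimal sketch of how these seven labels can be one-hot encoded for the categorical cross-entropy loss used later; the label order and the use of Keras' to_categorical are illustrative assumptions, not necessarily what the project's script does.

from tensorflow.keras.utils import to_categorical

# The seven emotion classes listed above; this index order is an
# illustrative assumption, not necessarily the order used in the code.
EMOTIONS = ["happy", "sad", "angry", "disgust", "fear", "surprise", "contempt"]

# Map a string label to its integer index, then one-hot encode it for
# use with categorical cross-entropy loss.
label = "surprise"
index = EMOTIONS.index(label)                               # 5
one_hot = to_categorical(index, num_classes=len(EMOTIONS))
print(one_hot)                                              # [0. 0. 0. 0. 0. 1. 0.]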

Dataset

I have used the Extended Cohn-Kanade (CK+) dataset, which can be found here.
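
A minimal loading sketch, assuming the CK+ frames have already been sorted into one folder per emotion and that OpenCV is installed; the real dataset ships as image sequences with separate label files, so the paths and the 48x48 target size here are only placeholders.

import os
import cv2
import numpy as np

def load_dataset(root="data", size=(48, 48)):
    # Assumes a layout like data/happy/*.png, data/sad/*.png, ...
    # Adapt the paths to however the CK+ frames were actually arranged.
    images, labels = [], []
    for label in sorted(os.listdir(root)):
        class_dir = os.path.join(root, label)
        if not os.path.isdir(class_dir):
            continue
        for fname in os.listdir(class_dir):
            img = cv2.imread(os.path.join(class_dir, fname), cv2.IMREAD_GRAYSCALE)
            if img is None:
                continue
            images.append(cv2.resize(img, size))
            labels.append(label)
    return np.array(images), np.array(labels)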

Reference Paper

The reference paper used is this research paper published on Springer.

Preprocessing

The images were preprocessed before being fed to the CNN; the three face-cropping variants compared under Evaluation below are part of this step.
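
As a rough illustration of the cropping step, the sketch below uses OpenCV's bundled Haar cascade to detect and crop the face region; the cascade file and detection parameters are illustrative choices, and the project's actual cropping method may differ.

import cv2

# Hedged sketch of one possible cropping step: detect the face with
# OpenCV's bundled frontal-face Haar cascade and keep only the face region.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def crop_face(gray_img):
    faces = face_cascade.detectMultiScale(gray_img, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return gray_img                       # fall back to the full frame
    x, y, w, h = faces[0]
    return gray_img[y:y + h, x:x + w]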

CNN model

The model is a Keras Sequential model comprising two convolutional layers (Conv2D), two MaxPooling2D layers, and a Flatten layer followed by an output Dense layer with softmax activation, compiled with the Adam optimizer and categorical cross-entropy loss.
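
A hedged sketch of such a model in Keras is shown below; the filter counts, kernel sizes, and 48x48 grayscale input shape are illustrative guesses, and the optional hidden Dense layer corresponds to the neuron counts varied in the Evaluation section.

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv2D, MaxPooling2D, Flatten, Dense

def build_model(input_shape=(48, 48, 1), hidden_neurons=256, num_classes=7):
    # Two Conv2D + MaxPooling2D blocks, Flatten, an optional hidden Dense
    # layer (its size is what the evaluation varies), and a softmax output.
    # Filter counts and kernel sizes are illustrative, not the exact settings.
    model = Sequential()
    model.add(Conv2D(32, (3, 3), activation="relu", input_shape=input_shape))
    model.add(MaxPooling2D(pool_size=(2, 2)))
    model.add(Conv2D(64, (3, 3), activation="relu"))
    model.add(MaxPooling2D(pool_size=(2, 2)))
    model.add(Flatten())
    if hidden_neurons > 0:                     # 0 means no hidden dense layer
        model.add(Dense(hidden_neurons, activation="relu"))
    model.add(Dense(num_classes, activation="softmax"))
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model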

Evaluation

Evaluated the model on three different cropping methods:

  • Cropping with background
  • Cropping without background
  • Cropping without forehead

The number of neurons in the hidden dense layer was also varied (0, 256, 512, 1024), and a ten-fold cross-validation was performed with the cropping method fixed (without background) while varying the neuron count.
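
A rough sketch of this cross-validation protocol follows, assuming scikit-learn's StratifiedKFold and the build_model sketch from the CNN model section above; the epoch count and batch size are guesses, not the project's actual settings.

import numpy as np
from sklearn.model_selection import StratifiedKFold

def cross_validate(X, y_onehot, hidden_neurons=256, folds=10):
    # X is expected to have shape (N, 48, 48, 1); y_onehot has shape (N, 7).
    y_int = np.argmax(y_onehot, axis=1)        # integer labels for stratification
    skf = StratifiedKFold(n_splits=folds, shuffle=True, random_state=42)
    scores = []
    for train_idx, test_idx in skf.split(X, y_int):
        model = build_model(hidden_neurons=hidden_neurons)
        model.fit(X[train_idx], y_onehot[train_idx],
                  epochs=30, batch_size=32, verbose=0)
        _, acc = model.evaluate(X[test_idx], y_onehot[test_idx], verbose=0)
        scores.append(acc)
    return np.mean(scores)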

Code

The link to the whole assembled code is here.

How to run

To run the model, just run the Python script facial_expression_recognition.py.

Results

The results of the various evaluation methods are shown in the tables below:

Cropping without background

No. of neurons Accuracy graph Confusion matrix
0 Link Link
256 Link Link
512 Link Link
1024 Link Link

Cropping without forehead

No. of neurons Accuracy graph Confusion matrix
0 Link Link
256 Link Link
512 Link Link
1024 Link Link

Cropping with background

No. of neurons Accuracy graph Confusion matrix
0 Link Link
256 Link Link
512 Link Link
1024 Link Link

Cross Validation

A ten-fold cross-validation was performed on our dataset. Following the research paper, it used the same cropping method (without background) but different numbers of neurons.

No. of neurons Accuracy (%)
0 97.96
256 98.27
512 96.62
1024 97.14

Average accuracy

In the paper On our model
97.38 97.49

Reference Links
