Facial Expression Recognition

Research Paper

Our published paper can be found here: https://arxiv.org/abs/2004.11823

Mobile App

Our mobile app can be accessed here: http://cs230-fer.firebaseapp.com/
Note: you may run into permission issues on iPhones using the Safari browser

Introduction

Facial expressions are a universal way for people to communicate. This repository demonstrates several deep learning models for detecting emotions, including a five-layer convolutional network and transfer learning models.

Our best model achieves a state-of-the-art test accuracy of 75.8%, outperforming the highest reported 75.2% test accuracy in published work at the time of writing [1].

Additionally, we apply our FER models to the real world with an on-device, real-time mobile web app.
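As a hedged illustration of the kind of model described above, a five-layer convolutional network for FER might be sketched in Keras as follows. The filter counts and layer sizes here are assumptions for illustration, not the paper's exact architecture; the 48x48 grayscale input and seven emotion classes follow the common FER2013 convention.

```python
from tensorflow.keras import layers, models

def build_fer_cnn(input_shape=(48, 48, 1), n_classes=7):
    """Five weight layers: three conv blocks followed by two dense layers."""
    return models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(32, 3, padding="same", activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, padding="same", activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(128, 3, padding="same", activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(256, activation="relu"),
        layers.Dense(n_classes, activation="softmax"),
    ])
```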

Getting Started

The included Jupyter notebooks define all of their prerequisites internally.

  1. Refer to our paper for implementation details and poster/video for a high-level overview.
  2. Download datasets as described in datasets/README.md.
  3. Run one of the Jupyter notebooks in the top-level directory.

Interpretability

Occlusion-based Saliency Maps

Our web app model learned to focus on the mouth and nose for disgust, the mouth for happiness, and the eyes and nose for surprise. For neutral images, it focused on all parts of the face except the nose, which makes sense given that small changes in non-nose regions tend to correspond to changes in emotion.

Occlusion-based Saliency Map
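The occlusion technique can be sketched as follows; `model_fn` is a hypothetical stand-in for any function mapping an image to a vector of class probabilities, and the patch size and stride are illustrative.

```python
import numpy as np

def occlusion_saliency(model_fn, image, label, patch=8, stride=4, fill=0.0):
    """Slide a square occluder over the image and record how much the
    model's probability for `label` drops when each region is hidden.
    High values mark regions the model relies on for its prediction."""
    h, w = image.shape[:2]
    base = model_fn(image)[label]  # unoccluded probability
    rows = (h - patch) // stride + 1
    cols = (w - patch) // stride + 1
    heat = np.zeros((rows, cols))
    for i, y in enumerate(range(0, h - patch + 1, stride)):
        for j, x in enumerate(range(0, w - patch + 1, stride)):
            occluded = image.copy()
            occluded[y:y + patch, x:x + patch] = fill  # hide one region
            heat[i, j] = base - model_fn(occluded)[label]
    return heat
```

The resulting heat map can be upsampled and overlaid on the input image to visualize which facial regions drive each emotion prediction.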

Confusion Matrix

The most frequent misclassification by our model was true sad images predicted as neutral. To address this, we can further augment the web app's training dataset with more correctly labeled sad images.

Confusion Matrix
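A minimal sketch of how such a confusion matrix can be computed and its worst off-diagonal cell located; the class names and label order assume the FER2013 convention and may differ from the web app model's actual labels.

```python
import numpy as np

# Label order assumed to follow the FER2013 convention.
EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

def confusion_matrix(y_true, y_pred, n_classes=len(EMOTIONS)):
    """Rows are true classes, columns are predicted classes."""
    cm = np.zeros((n_classes, n_classes), dtype=int)
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1
    return cm

def worst_confusion(cm):
    """Return (true, predicted) indices of the largest off-diagonal cell."""
    off = cm.copy()
    np.fill_diagonal(off, 0)  # ignore correct predictions
    return np.unravel_index(off.argmax(), off.shape)
```

On our data, `worst_confusion` would flag the sad-predicted-as-neutral cell described above.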

Video

video

Background

This was our final project for CS230: Deep Learning at Stanford.

Acknowledgements

NED University

The mobile web application development was led by Assistant Professor Asma Khan and her team of undergraduate students in the Software Engineering department of NED University in Karachi, Pakistan: Muhammad Bilal Khan (app/hosting), Muhammad Hassan-ur-Rehman (app/hosting), and Tooba Ali (UI design). Muhammad Ashhad Bin Kashif and Summaiya Sarfaraz assisted with network interpretability. Muhammad Hasham Khalid and Midha Tahir worked on auxiliary dataset preparation, error analysis, and model tuning for the web app.
