
Emotion Detection with Deep Learning Models

Real-time and video-based emotion recognition using deep learning (VGG/LeNet/GoogLeNet). Supports live webcam streams and pre-recorded videos.

🔧 Basic Setup

1️⃣ Fork the Repository

Fork (or clone) this repository and change into its root folder, which should be named Emotion_Detection.

2️⃣ Create Conda Environment

conda create -n <venv_name> python=3.9.22
conda activate <venv_name>

3️⃣ Install Dependencies

Core requirements:
  • OpenCV 4.5+
  • TensorFlow 2.6+
  • Other packages listed in requirements.txt

pip install -r requirements.txt

💡 Conda Tip: For GPU support, install TensorFlow with Conda:
conda install -c conda-forge tensorflow-gpu

 

🚀 Quick Start Guide

Prerequisites:
  • You are in the project root directory
  • Environment is set up (requirements.txt installed)
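Every command in this guide is run from the project root. A quick sanity check for that prerequisite (the directory layout here is inferred from the commands below, not from the repo itself):

```python
from pathlib import Path

def in_project_root(root: Path = Path(".")) -> bool:
    """Heuristic check: the commands in this guide assume the
    emo_rec/ package sits directly under the current directory."""
    return (root / "emo_rec").is_dir()

if __name__ == "__main__":
    print("OK" if in_project_root() else "Run this from the Emotion_Detection root")
```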

 

1️⃣ Build the Dataset

python emo_rec/build_dataset.py
📁 Output:
  • The training, validation, and testing datasets will be stored under datasets/fer2013/hdf5
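Once the build step finishes, the split files land under datasets/fer2013/hdf5. A small sketch for listing whatever .hdf5 files were produced (the exact file names depend on the build script's configuration, so none are assumed here):

```python
from pathlib import Path

# Output directory named in this guide
HDF5_DIR = Path("datasets/fer2013/hdf5")

def list_hdf5_splits(hdf5_dir: Path = HDF5_DIR) -> list[str]:
    """Return the names of any .hdf5 files the build step produced."""
    if not hdf5_dir.is_dir():
        return []
    return sorted(p.name for p in hdf5_dir.glob("*.hdf5"))
```

If the list comes back empty after running build_dataset.py, re-check that the script ran from the project root.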

 

2️⃣ Train a Model

Option A: Specify the Model You Want to Train

python emo_rec/train_emotion_detector.py -m <model_name>

Available Models:

  • emotionvggnet (default)
  • lenet
  • minigooglenet
  • minivggnet
  • shallownet
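The real CLI lives inside train_emotion_detector.py; as a sketch only, the -m flag with the model names above could be wired up with argparse like this (the flag spelling and default come from this guide, everything else is an assumption):

```python
import argparse

# Model names listed in this guide; "emotionvggnet" is the stated default
MODELS = ["emotionvggnet", "lenet", "minigooglenet", "minivggnet", "shallownet"]

def build_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(description="Train an emotion detector")
    parser.add_argument("-m", "--model", choices=MODELS,
                        default="emotionvggnet",
                        help="network architecture to train")
    return parser

# e.g. `-m lenet` selects the LeNet architecture
args = build_parser().parse_args(["-m", "lenet"])
```

Using choices= means an unknown model name fails fast with a usage message instead of a stack trace later in training.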

Option B: Train Default Model (emotionvggnet)

python emo_rec/train_emotion_detector.py
📁 Outputs:
  • Trained models: emo_rec/built_models/
  • Training logs: emo_rec/training_logs/

 

3️⃣ Test a Model

python emo_rec/test_emotion_detector.py -m <model_name>
❗ Requirements:
  • Model must be trained first (Step 2)
  • The -m flag is mandatory

Example:

python emo_rec/test_emotion_detector.py -m minivggnet

4️⃣ Run Emotion Detection

Option A: Process Video File

python emo_rec/run_emotion_detector.py -m <model_name> -v <video_path>

Example (to run the demo video already provided):

python emo_rec/run_emotion_detector.py -m emotionvggnet -v emo_rec/video/example.mp4

Option B: Real-Time Webcam

python emo_rec/run_emotion_detector.py -m <model_name>
🖥️ Controls:
  1. Select the camera window
  2. To quit, press:
    • Mac: Cmd + Q
    • Windows: Ctrl + Q

Example:

python emo_rec/run_emotion_detector.py -m lenet
🔴 Important:
  • Model must be trained first (Step 2)
  • -m <model_name> is always required
  • Uses the default camera device
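Whichever model runs the detection, the last step is mapping the network's per-class scores back to an emotion name. FER2013 has seven emotion classes, so a minimal version of that lookup might read as follows (the label names and their order are an assumption; check the repo's own configuration for the order the models were trained with):

```python
# Seven FER2013-style emotion classes; order here is hypothetical.
EMOTIONS = ["angry", "disgust", "scared", "happy", "sad", "surprised", "neutral"]

def top_emotion(scores: list[float]) -> str:
    """Pick the label with the highest predicted probability."""
    if len(scores) != len(EMOTIONS):
        raise ValueError("expected one score per emotion class")
    best = max(range(len(scores)), key=scores.__getitem__)
    return EMOTIONS[best]
```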
