shining0611armor/Deep-Generative-Models
🎓 Deep Generative Models (DGM) – Spring 1404 / 2025

K. N. Toosi University of Technology
Instructor: Dr. B. Nasihatkon
Course Level: Master’s & PhD
Head Teaching Assistant and Designer: Mehran Tamjidi


📘 Course Overview

Welcome to the official repository for the graduate-level course Deep Generative Models (DGM), offered in Spring 1404 (2025) at K. N. Toosi University of Technology. This course explores the mathematical foundations and modern deep learning techniques underlying generative modeling, with a strong emphasis on probabilistic reasoning, graphical models, and state-of-the-art architectures for image and data generation.

This repository contains all course materials and homework assignments, designed to guide students through foundational theory, structured probabilistic models, and cutting-edge generative algorithms using PyTorch.


📦 Included Modules & Assignments

🧮 Homework 1 – Probability and Statistical Foundations

Lay the groundwork with key probabilistic and statistical concepts required for understanding generative models:

  • 📍 Modeling Stone Drops in a Cylindrical Well – Analyze continuous distributions and transformations.
  • 🔗 Joint, Marginal, and Conditional Distributions – Discrete variable modeling and normalization via the Riemann zeta function.
  • 📊 Testing Independence – Use tabulated PMFs to assess variable relationships.
  • 📐 Parameter Counting – Quantify the degrees of freedom in discrete models.
  • 🔄 Conditional Independence Proofs – Solidify your grasp of graphical model semantics.
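As a taste of the parameter-counting exercise, the sketch below contrasts a full joint table with a Bayes-net factorization. The function names and the chain example are illustrative, not the assignment's exact setup:

```python
# Count free parameters in discrete probabilistic models (illustrative
# helpers, not the assignment's exact notation).

def joint_params(num_vars: int, num_states: int) -> int:
    """A full joint table over n k-ary variables has k^n - 1 free entries
    (one entry is fixed by normalization)."""
    return num_states ** num_vars - 1

def bayes_net_params(parent_counts: list, num_states: int) -> int:
    """Each variable with p parents needs (k - 1) * k^p free parameters
    for its conditional probability table."""
    return sum((num_states - 1) * num_states ** p for p in parent_counts)

# Example: 5 binary variables as a chain X1 -> X2 -> X3 -> X4 -> X5.
full = joint_params(5, 2)                      # 2^5 - 1 = 31
chain = bayes_net_params([0, 1, 1, 1, 1], 2)   # 1 + 4 * 2 = 9
print(full, chain)
```

The gap between the two counts is exactly what makes structured models tractable.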

🧠 Homework 2 – Bayesian Network-Based Generation of Persian Digits

Implement generative models using Bayesian Networks:

  • 🖼️ Binary Persian Digit Dataset – Preprocessing and format handling.
  • 🕸️ Bayesian Network Construction – Pixel-wise CPDs using 3-, 8-, and 15-connected structures.
  • 📊 Linear Sigmoid CPDs – Shared parameterization across the image grid.
  • 💻 Deliverables – PyTorch implementations with well-documented code and generated digit samples.
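To make the shared linear-sigmoid CPD idea concrete, here is a minimal sketch assuming a raster-scan ordering where each pixel's parents are three previously generated neighbors (left, upper-left, up); the tiny 4×4 grid and all names are illustrative:

```python
import torch

# Pixel-wise linear-sigmoid CPD for a binary-image Bayes net, with one
# weight vector and bias SHARED across the whole grid (illustrative sketch).

H, W, NUM_PARENTS = 4, 4, 3
torch.manual_seed(0)
w = torch.randn(NUM_PARENTS)   # shared weights over the 3 parent pixels
b = torch.tensor(0.0)          # shared bias

def parents(img, r, c):
    """3-connected parents (left, upper-left, up); 0 when out of bounds."""
    get = lambda rr, cc: img[rr, cc] if rr >= 0 and cc >= 0 else torch.tensor(0.0)
    return torch.stack([get(r, c - 1), get(r - 1, c - 1), get(r - 1, c)])

def sample_image():
    img = torch.zeros(H, W)
    for r in range(H):
        for c in range(W):
            # p(x_rc = 1 | parents) = sigmoid(w . parents + b)
            p = torch.sigmoid(w @ parents(img, r, c) + b)
            img[r, c] = torch.bernoulli(p)
    return img

sample = sample_image()
print(sample.shape)  # torch.Size([4, 4])
```

The 8- and 15-connected variants only change which neighbors `parents` collects.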

🔍 Homework 3 – Variational Inference & Autoregressive Modeling

Tackle both inference and generation with modern deep learning:

  • 📐 Deriving ELBO – For fully factorized variational posteriors in Bayes Nets.
  • 🤖 Lightweight Image GPT – Autoregressive generation of Persian digit images with resource-efficient techniques.
  • 🧮 PyTorch Implementation – Modular code with interpretable sampling and inference.
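The core of the autoregressive part is a sequential sampling loop. The sketch below uses a tiny embedding-plus-linear stand-in for a causal Transformer so the control flow is the focus; sizes and names are illustrative:

```python
import torch
import torch.nn as nn

# Minimal autoregressive sampling loop in the spirit of a lightweight
# Image GPT: pixels are generated one at a time, each conditioned on the
# tokens generated so far.

SEQ_LEN, VOCAB = 16, 2   # a 4x4 binary image flattened to 16 tokens
torch.manual_seed(0)

class TinyAR(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB + 1, 8)  # +1 for a start token
        self.head = nn.Linear(8, VOCAB)

    def forward(self, tokens):
        # A real Image GPT applies causally masked self-attention here;
        # this stand-in just reads off the last token's embedding.
        return self.head(self.embed(tokens)[:, -1])

@torch.no_grad()
def sample(model):
    tokens = torch.full((1, 1), VOCAB)  # start token
    for _ in range(SEQ_LEN):
        logits = model(tokens)                        # (1, VOCAB)
        nxt = torch.multinomial(logits.softmax(-1), 1)
        tokens = torch.cat([tokens, nxt], dim=1)
    return tokens[:, 1:].view(4, 4)  # drop start token, reshape to image

img = sample(TinyAR())
print(img.shape)  # torch.Size([4, 4])
```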

🌌 Homework 4 – Normalizing Flows & Latent Space Manipulation

Dive into latent-variable generative models and 3D data processing:

  • 🎨 Image Generation with GLOW – Attribute transfer, noise injection, and embedding-space editing.
  • 🌀 3D Point Cloud Denoising – Apply flow-based denoising to corrupted ModelNet-40C point clouds.
  • 🛠️ End-to-End PyTorch Code – Focused on reproducibility and experimentation.
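GLOW-style flows are built from invertible layers with cheap Jacobian log-determinants. This sketch of an affine coupling layer (shapes and names illustrative) shows the two properties every flow layer needs, exact inversion and an exact log-det:

```python
import torch
import torch.nn as nn

# Affine coupling layer: half the dimensions pass through unchanged and
# parameterize an invertible affine transform of the other half, so the
# Jacobian log-determinant is just the sum of the log-scales.

class AffineCoupling(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.net = nn.Linear(dim // 2, dim)  # predicts log-scale and shift

    def forward(self, x):
        x1, x2 = x.chunk(2, dim=-1)
        log_s, t = self.net(x1).chunk(2, dim=-1)
        log_s = torch.tanh(log_s)            # keep scales well-behaved
        y2 = x2 * log_s.exp() + t
        logdet = log_s.sum(dim=-1)           # exact Jacobian log-determinant
        return torch.cat([x1, y2], dim=-1), logdet

    def inverse(self, y):
        y1, y2 = y.chunk(2, dim=-1)
        log_s, t = self.net(y1).chunk(2, dim=-1)
        log_s = torch.tanh(log_s)
        return torch.cat([y1, (y2 - t) * (-log_s).exp()], dim=-1)

layer = AffineCoupling(4)
x = torch.randn(2, 4)
y, logdet = layer(x)
x_rec = layer.inverse(y)
print(torch.allclose(x, x_rec, atol=1e-5))  # True: inversion recovers the input
```

Full GLOW additionally interleaves actnorm and invertible 1×1 convolutions between coupling layers.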

🔋 Homework 5 – Energy-Based Models, JEM++, DVAEs & GANs

Explore advanced generative modeling paradigms through deep energy formulations, discrete representations, and adversarial training:

  • ⚡ Energy-Based Models (EBMs) – Train scalar-valued networks E(x) and E(x, y) using stochastic gradient Langevin dynamics (SGLD) and contrastive divergence. Compare unconditional and class-conditional sampling on CIFAR-10 and MNIST.
  • 🔁 Replay Buffer & Noise Scheduling – Investigate initialization strategies (random vs. replay buffer) and noise annealing using exponential schedules.
  • 🔐 JEM & JEM++ – Turn a classifier into a joint energy model. Use a mix of cross-entropy and generative losses with SGLD. JEM++ adds GMM-based initialization and Proximal SGLD for improved sampling stability.
  • 🔢 Discrete Variational Autoencoders (DVAEs) – Quantize latent space using vector quantization, apply KMeans clustering, and analyze using silhouette scores, confusion matrices, and t-SNE plots.
  • 🇮🇷 GANs for Persian Digits – Train and compare vanilla GAN and Wasserstein GAN (WGAN) on binary Persian digits. Assess image quality, loss convergence, and sample fidelity.
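The vector-quantization step at the heart of the DVAE part can be sketched in a few lines; the codebook size, dimensions, and names below are illustrative, and a full VQ model would add commitment losses and a straight-through gradient estimator:

```python
import torch

# Vector quantization: snap each encoder output to its nearest codebook
# vector, yielding a discrete latent code per input.

torch.manual_seed(0)
K, D = 8, 4                      # codebook entries, embedding dimension
codebook = torch.randn(K, D)

def quantize(z):
    """Map each row of z (N, D) to the index and value of its nearest code."""
    dists = torch.cdist(z, codebook)     # (N, K) pairwise Euclidean distances
    idx = dists.argmin(dim=1)            # discrete latent codes
    return idx, codebook[idx]

z = torch.randn(5, D)
idx, z_q = quantize(z)
print(idx.shape, z_q.shape)
```

The discrete codes `idx` are what the assignment then feeds to k-means, silhouette, and t-SNE analyses.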

Each section is implemented using PyTorch and includes visualizations, quantitative evaluations, and reproducible experiment code.
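For the EBM portions, the basic SGLD sampler looks like the sketch below. The quadratic toy energy stands in for a learned network, and the step size, noise scale, and iteration count are illustrative choices:

```python
import torch

# SGLD sampling from an energy-based model: starting from noise (or, in
# the assignment, a replay buffer), repeatedly step down the energy
# gradient while injecting Gaussian noise.

torch.manual_seed(0)

def energy(x):
    """Toy energy with its minimum at x = (2, 2); a real EBM is a network."""
    return 0.5 * ((x - 2.0) ** 2).sum(dim=-1)

def sgld_sample(n_steps=200, step=0.1, noise=0.01, batch=64, dim=2):
    x = torch.randn(batch, dim)              # random initialization
    for _ in range(n_steps):
        x = x.detach().requires_grad_(True)
        grad = torch.autograd.grad(energy(x).sum(), x)[0]
        # Langevin update: gradient descent on E plus Gaussian noise.
        x = x - step * grad + noise * torch.randn_like(x)
    return x.detach()

samples = sgld_sample()
print(samples.mean(dim=0))  # concentrates near the minimum at (2, 2)
```

Replay-buffer initialization simply replaces the `torch.randn` start with previously drawn samples, and noise annealing shrinks `noise` over the course of the chain.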


🛠️ Technologies & Tools

  • Python 3.10+
  • PyTorch
  • NumPy / SciPy / Matplotlib
  • Jupyter Notebooks
  • TorchVision & TorchMetrics

🎯 Learning Objectives

By the end of this course, students will be able to:

  • Understand and apply the probabilistic foundations of generative modeling.
  • Design and implement Bayesian networks and variational inference techniques.
  • Construct autoregressive models for structured data generation.
  • Utilize normalizing flows for high-quality image synthesis and latent space exploration.
  • Apply energy-based modeling and adversarial frameworks for advanced generation tasks.

📚 Academic Integrity

All submitted work must be your own. Discussion and collaboration are encouraged for conceptual understanding, but all code and write-ups must be produced individually unless explicitly stated otherwise.

📫 Contact

Feel free to reach out if you have any questions or suggestions:

Happy Learning! 😊


🔖 Citation

If you find the course material useful for academic or educational purposes, please cite the course as:

Deep Generative Models (DGM), Dr. B. Nasihatkon, K. N. Toosi University of Technology, Spring 1404 (2025).

About

This repository contains the course materials for our Deep Generative Models course at K. N. Toosi University of Technology in Spring 2025.
