Computer Science: Deep Generative Models

Description

Generative models are a key paradigm for probabilistic reasoning within graphical models and probabilistic programming languages, and one of the most exciting and rapidly evolving fields of statistical machine learning and artificial intelligence. Recent advances in parameterizing generative models with deep neural networks, combined with progress in stochastic optimization methods, have enabled scalable modeling of complex, high-dimensional data, including images, text, and speech. In this course, we will study the probabilistic foundations and learning algorithms for deep generative models and discuss application areas that have benefited from them.

What you will learn

  • A powerful framework for learning data distributions
  • How to apply these models to decision making, finding analogies, and predicting future events
  • Applications of deep generative models, including computer vision, speech, and language processing

Prerequisites

Basic knowledge of machine learning from at least one of CS 221, 228, 229, or 230. Students will work with computational and mathematical models and should have a basic understanding of probability and calculus. Proficiency in a programming language, preferably Python, is required.

Topics include

  • Autoregressive models
  • Variational autoencoders
  • Normalizing flow models
  • Generative adversarial networks
  • Energy-based models
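
As a purely illustrative sketch of one of these topics, the variational autoencoder, the Python (PyTorch) snippet below trains a tiny VAE on synthetic data by minimizing the negative evidence lower bound. The network sizes, data, and hyperparameters are our own assumptions for illustration and are not taken from the course materials.

    # Minimal variational autoencoder sketch (illustrative only).
    import torch
    import torch.nn as nn

    class VAE(nn.Module):
        def __init__(self, x_dim=20, z_dim=2, h_dim=64):
            super().__init__()
            self.encoder = nn.Sequential(nn.Linear(x_dim, h_dim), nn.ReLU())
            self.mu = nn.Linear(h_dim, z_dim)        # mean of q(z|x)
            self.logvar = nn.Linear(h_dim, z_dim)    # log-variance of q(z|x)
            self.decoder = nn.Sequential(
                nn.Linear(z_dim, h_dim), nn.ReLU(), nn.Linear(h_dim, x_dim)
            )

        def forward(self, x):
            h = self.encoder(x)
            mu, logvar = self.mu(h), self.logvar(h)
            # Reparameterization trick: z = mu + sigma * eps, eps ~ N(0, I)
            z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
            return self.decoder(z), mu, logvar

    def negative_elbo(x, x_hat, mu, logvar):
        # Reconstruction error plus KL(q(z|x) || p(z)) with p(z) = N(0, I)
        recon = ((x - x_hat) ** 2).sum(dim=1).mean()
        kl = (-0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(dim=1)).mean()
        return recon + kl

    if __name__ == "__main__":
        torch.manual_seed(0)
        model = VAE()
        opt = torch.optim.Adam(model.parameters(), lr=1e-3)
        data = torch.randn(256, 20)  # synthetic stand-in for a real dataset
        for step in range(200):
            x_hat, mu, logvar = model(data)
            loss = negative_elbo(data, x_hat, mu, logvar)
            opt.zero_grad()
            loss.backward()
            opt.step()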

Notes

Course Availability

The course schedule is displayed for planning purposes – courses can be modified, changed, or cancelled. Course availability will be considered finalized on the first day of open enrollment. For quarterly enrollment dates, please refer to our graduate education section.