Description

The rapid growth of information and data in recent decades has brought a new mindset, one in which information is treated not just as an abstract idea but as a precise mathematical quantity. That mindset is embodied in information theory, founded by Claude Shannon in 1948: the study of the quantification, storage, and communication of information, whose results underpin modern digital communication, including the internet.

This course covers the basic concepts of information theory before going deeper into areas such as entropy, data compression, mutual information, channel capacity, and applications to statistics and machine learning.

NOTE: This course was formerly EE 376A.

Prerequisites

EE 178 or STATS 116, or equivalent.

Topics include

  • Entropy and mutual information (see the short example after this list)
  • Source coding theorem and Huffman code
  • Universal compression and distribution estimation
  • Channel capacity and noisy channel coding theorem
  • Polar codes, Gaussian channels and continuous random variables
  • Maximum entropy principle and applications to hypothesis testing
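
As a taste of the first topic, here is a minimal sketch, assuming nothing beyond the Python standard library (it is illustrative only, not taken from the course materials), that computes the Shannon entropy H(X) = -sum over x of p(x) log2 p(x) of a discrete distribution:

import math

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution,
    given as a sequence of probabilities summing to 1."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit of information per flip:
print(entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, hence less informative:
print(entropy([0.9, 0.1]))   # ~0.47

The fair coin attains the maximum possible entropy for two outcomes, which is why one fair flip is worth exactly one bit.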

Course Availability

The course schedule is displayed for planning purposes; courses may be modified or cancelled. Course availability will be considered finalized on the first day of open enrollment. For quarterly enrollment dates, please refer to our graduate education section.