Information Theory

The course currently includes six modules (approx. 4 hours per module).

This is a graduate-level introductory course in Information Theory, where we introduce the mathematical notion of information and justify it through various operational meanings. The basic theory builds on probability theory and allows us to quantitatively measure the uncertainty and randomness in a random variable, as well as the information revealed by observing its value. We will encounter quantities such as entropy, mutual information, total variation distance, and KL divergence, and explain the role they play in important problems in communication, statistics, and computer science. Information theory was originally invented as a mathematical theory of communication, but it has since found applications in many areas ranging from physics to biology. In fact, information theory can help in any field where one wants to evaluate how much information about an unknown quantity is revealed by a particular experiment. In this course, we will lay down the foundations of this fundamental field.
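As a minimal sketch (not part of the course materials), three of the quantities mentioned above can be computed directly from their definitions for finite probability distributions, here represented as plain Python lists of probabilities:

```python
import math

def entropy(p):
    # Shannon entropy in bits: H(p) = -sum_i p_i * log2(p_i).
    # Terms with p_i = 0 contribute 0 by convention and are skipped.
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    # KL divergence in bits: D(p || q) = sum_i p_i * log2(p_i / q_i).
    # Assumes q_i > 0 wherever p_i > 0 (otherwise the divergence is infinite).
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def total_variation(p, q):
    # Total variation distance: (1/2) * sum_i |p_i - q_i|.
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

# A fair coin carries exactly one bit of entropy.
print(entropy([0.5, 0.5]))                          # 1.0
# Distance between a fair and a biased coin.
print(total_variation([0.5, 0.5], [0.75, 0.25]))    # 0.25
# KL divergence is zero iff the distributions coincide.
print(kl_divergence([0.5, 0.5], [0.75, 0.25]))
```

The course develops the operational meaning behind these formulas; the snippet only illustrates how simple the definitions themselves are.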

Prof. Himanshu Tyagi from CNI offered this instructor-led course on the NPTEL portal during July-Oct 2020.

The course lectures are available for self-study on the NPTEL portal.

Additionally, we have developed supplementary lectures to explain some of the key concepts in Information Theory in simpler terms.