FEO3350 Information Theory for Statistics and Learning, Fall 2020


Information theory, machine learning and artificial intelligence have been overlapping fields throughout their existence as academic disciplines. These areas, in turn, overlap significantly with applied and theoretical statistics.

Arguably the most central concepts in information theory are entropy, mutual information and relative entropy (Kullback-Leibler divergence). These quantities are also important in inference and learning, for example via their manifestation in the evidence lower bound (variational free energy). Entropy and mutual information also play important roles in a class of general bounds on the error probability in estimation and decision-making, whose most basic special case is Fano's inequality. Relative entropy was introduced in parallel in the statistics and information theory literature, and is a special case of the more general concept of f-divergence. Divergence is in general an important measure of "statistical dissimilarity," and plays a fundamental part in several bounding techniques. A more recent framework that has attracted considerable attention is the information bottleneck principle, which in turn has several interesting connections to traditional rate-distortion theory.
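For orientation, the standard definitions of these quantities, stated here for discrete alphabets (the logarithm base only fixes the information unit), are

  H(X) = -\sum_x P_X(x) \log P_X(x), \qquad D(P\|Q) = \sum_x P(x) \log \frac{P(x)}{Q(x)},

  I(X;Y) = D(P_{XY} \,\|\, P_X P_Y), \qquad D_f(P\|Q) = \sum_x Q(x)\, f\!\left(\frac{P(x)}{Q(x)}\right),

where f is convex with f(1) = 0; relative entropy is the f-divergence with f(t) = t \log t, and total variation corresponds to f(t) = \tfrac{1}{2}|t-1|. Fano's inequality, in its most basic form, states that for any estimator \hat{X} = g(Y) of a random variable X taking values in a finite set \mathcal{X},

  H(X \mid Y) \le h_b(P_e) + P_e \log(|\mathcal{X}| - 1), \qquad P_e = \Pr[\hat{X} \ne X],

where h_b denotes the binary entropy function.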

This course will explore these, and several other, relations and tools in some depth. The goal is to give PhD students in decision and control, learning, AI, network science, and information theory a solid introduction to how information-theoretic concepts and tools can be applied to problems in statistics, decision-making and learning, well beyond their more traditional use in, for example, communication theory.

The course is registered as FEO3350 and is worth 12 credits.


Teachers

Mikael Skoglund and Tobias Oechtering


Prerequisites

Required: Solid working knowledge (at the "advanced undergraduate level") of analysis, linear algebra and probability

Recommended: Information theory, corresponding to FEO3210; (measure theoretic) probability, corresponding to FEO3230; optimization, corresponding to SF3847


Material

The course will draw on several different sources. The following is a partial list of recommended textbooks, tutorials, lecture notes and papers.


Preliminary Schedule 2020-21

MS = Mikael Skoglund, TO = Tobias Oechtering
  • Lecture 1, November 6 [MS]: Information theory fundamentals: Entropy, mutual information, relative entropy, and f-divergence. Total variation and other distance metrics. Inequalities. [CT,PW,W]
  • Lecture 2, November 13 [MS]: Rate-distortion theory: Cost versus information. Bounds. The Blahut algorithm. [CT,PW]
  • Lecture 3, February 12 (starting at 10:00, Q2) [MS]: Limits on information flow and processing: Conditional mutual information and relative entropy. Data processing inequalities. Sufficient statistics and the information bottleneck. Rate-distortion interpretation. [CT,PW,W,GP]
  • Lecture 4, February 19 (starting at 15:00, Q2) [MS]: Foundations of statistical decision theory: Parameter estimation. Bayes and minimax risk. Binary hypothesis testing [PW,W,S]
  • Lecture 5, February 26 (starting at 10:00, Q2) [MS]: Information bounds on error probability and risk: Sample complexity. The mutual information method and rate-distortion. Fano inequalities. [W,PW,D]
  • Lecture 6, April 26 (starting at 10:00 over Zoom) [MS]: Learning and generalization: Information bounds on generalization error. VC dimension and complexity. [XR,BBL,D,V]
  • Lecture 7, May 3 (starting at 10:00 over Zoom) [MS]: Variational methods: Variational characterization of divergence, Donsker-Varadhan [PW,W]. Variational inference and the ELBO [MK]. (Both objects are written out for reference after the schedule.)
  • Lecture 8, May 11 (10:00 over Zoom) [TO]: Classical estimation theory: Maximum likelihood, Fisher information, information bounds, Cramér-Rao, Hammersley-Chapman-Robbins. [CT,W,PW,D,S,MK]
  • Lecture 9, May 17 (14:00 over Zoom) [TO]: Packing, covering, Fano & minimax risk, metric entropy [W,D,Wa]
  • Lecture 10, May 24 (10:00 over Zoom) [TO]: Le Cam's method, Assouad's method, mutual information method continued. Density estimation. Functional estimation. [W,D,Wa]
  • Lecture 11, May 31 (13:00 over Zoom) [MS]: Dimension compression and denoising: Sparse denoising, compressed sensing, almost lossless analog compression [W,D,RWY,CRT,WV]
  • Lecture 12, June 8 (10:00 over Zoom) [TO]: The method of types [CT,CK,CS]
  • Lecture 13, June 14 (10:00, Q2) [TO]: Information theory and large deviations, Stein, Chernoff and Sanov. Total variation and hypothesis testing. [CT,CK,CS,PW]
  • Lecture 14, June 21 (10:00, Q2) [MS]: The geometry of information: Information geometry, information projection, iterative methods, Expectation-Maximization [C1,C2,CTu,CK,CS]
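For reference, two of the variational objects from Lecture 7, in their standard forms (notation not tied to any particular item on the reading list): the Donsker-Varadhan representation of relative entropy,

  D(P\|Q) = \sup_g \left\{ \mathbb{E}_P[g(X)] - \log \mathbb{E}_Q\!\left[e^{g(X)}\right] \right\},

with the supremum over, for instance, bounded measurable functions g, and the evidence lower bound for a latent-variable model p(x,z) with variational distribution q(z),

  \log p(x) = \mathbb{E}_q[\log p(x,Z) - \log q(Z)] + D(q \,\|\, p(\cdot \mid x)) \ge \mathbb{E}_q[\log p(x,Z) - \log q(Z)] =: \mathrm{ELBO}(q),

so that maximizing the ELBO over q is equivalent to minimizing D(q \,\|\, p(\cdot \mid x)).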

The first meeting (on November 6) is held in Room Q2 (Malvinas Väg 10), starting at 9:30.

APR 16, 2021: We start again on April 26, over Zoom. We will give at least the next two lectures (6 and 7) over Zoom. The links are posted in the schedule above. For HW problems 5 and 6 (due April 26 and May 3), please scan/photograph/typeset your solutions and send them by email to Mikael: skoglund@kth.se. Please also indicate which problems you would have been prepared to present in class.

MAY 17, 2021: Our present assumption is that we will return to physical meetings starting from Lecture 11 (May 31). We will announce here, at the latest on May 28, whether Lecture 11 will be over Zoom or in person.

MAY 26, 2021: We have decided to give Lectures 11 and 12 over Zoom as well. Most likely the final two lectures will also be given over Zoom; a decision will be announced here later.

JUNE 9, 2021: We will give the final two lectures in person, in Room Q2. See the schedule above.


Downloads

Lecture slides and homework problems will be posted here.