Course: 2024/2025

Information Theory for Machine Learning

(19288)

Requirements (Subjects that are assumed to be known)

Students should have a solid background in probability and calculus, as well as an enjoyment of mathematics.

This course teaches the fundamentals of Information Theory. Students will acquire a profound understanding of:
- Information-theoretic quantities, such as entropy, Kullback-Leibler divergence, and mutual information.
- Mathematical tools commonly used in Information Theory, such as Jensen's inequality.
- The concepts and fundamental theorems of data compression.
- The application of Information Theory in Machine Learning.
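As a flavour of the first outcome above, the core quantities can be computed directly from discrete distributions. The following is a minimal illustrative sketch (the distributions are made up for the example, not course material):

```python
import math

def entropy(p):
    """Shannon entropy H(X) = -sum_x p(x) log2 p(x), in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q), in bits."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def mutual_information(joint):
    """Mutual information I(X;Y) = D(p(x,y) || p(x)p(y)),
    with the joint distribution given as a 2-D table (list of rows)."""
    px = [sum(row) for row in joint]                 # marginal of X
    py = [sum(col) for col in zip(*joint)]           # marginal of Y
    return sum(
        pxy * math.log2(pxy / (px[i] * py[j]))
        for i, row in enumerate(joint)
        for j, pxy in enumerate(row)
        if pxy > 0
    )

# A fair coin has exactly one bit of entropy.
print(entropy([0.5, 0.5]))                                 # 1.0
# D(p || q) = 0 when the distributions coincide.
print(kl_divergence([0.5, 0.5], [0.5, 0.5]))               # 0.0
# Independent variables carry zero mutual information.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))    # 0.0
```

Note that all three quantities reduce to sums over the support of the distribution, which is why the `if pi > 0` guards suffice (terms with zero probability contribute nothing, by the convention 0 log 0 = 0).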

Skills and learning outcomes

Description of contents: programme

This course teaches the fundamentals of Information Theory. The topics covered in this course are as follows:
1) Fundamental quantities and concepts in Information Theory: entropy, Kullback-Leibler divergence, mutual information, Jensen's inequality, Fisher information, and the Cramér-Rao bound.
2) Lossless data compression: uniquely decodable and instantaneous source codes, Kraft's inequality, analysis of the optimal codeword length, Huffman codes, and universal compression.
3) Information theory and machine learning: EM algorithm, variational autoencoders, diffusion models, decision trees.
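To give a flavour of topic 2, the construction of an optimal prefix code can be sketched in a few lines. This is an illustrative Huffman coding sketch, not the course's reference implementation; the source alphabet and probabilities are made up:

```python
import heapq

def huffman_code(probs):
    """Build a Huffman code for a source given as {symbol: probability}.

    Returns {symbol: codeword}. The expected codeword length of a
    Huffman code is within one bit of the source entropy.
    """
    # Each heap entry: (probability, tie-breaker, {symbol: codeword-so-far}).
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        # Repeatedly merge the two least probable subtrees,
        # prepending '0' to one subtree's codewords and '1' to the other's.
        p1, _, c1 = heapq.heappop(heap)
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, counter, merged))
        counter += 1
    return heap[0][2]

code = huffman_code({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125})
# For this dyadic source, Kraft's inequality holds with equality
# and the expected length equals the entropy (1.75 bits).
print(code)
```

For dyadic probabilities such as these, the resulting codeword lengths are exactly -log2 p(x), so the code achieves the entropy bound; in general the expected length lies between H(X) and H(X) + 1.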

Learning activities and methodology

TRAINING ACTIVITIES
AF3 Theoretical and practical classes
AF4 Laboratory practices
AF5 Tutoring
AF6 Group work
AF7 Individual work
AF8 Partial and final exams
TEACHING METHODS
MD1 - Class lectures by the professor with the support of computer and audiovisual media, in which the main concepts of the course are developed and complemented with bibliography.
MD2 - Critical reading of texts recommended by the professor of the course.
MD3 - Resolution of practical cases, problems, etc., posed by the teacher, individually or in groups.
MD4 - Presentation and discussion in class, under the moderation of the professor, of topics related to the content of the course, as well as case studies.
MD5 - Elaboration of works and reports individually or in groups.
LECTURES
The basic concepts will mainly be taught at the blackboard. We will closely follow the book "Elements of Information Theory" by Cover & Thomas (see Basic Bibliography).
EXERCISES
To deepen their understanding of the taught material, students must hand in solutions to a set of problems every two weeks. These solutions will be graded on a scale from 1 to 10.
LABORATORIES
To reinforce the theoretical concepts learned in class, a laboratory exercise on topics related to machine learning will be carried out.
TUTORING
Two hours of tutoring sessions are scheduled each week, during which the professor is available in their office.

Assessment System

- % end-of-term-examination 0
- % of continuous assessment (assignments, laboratory, practicals...) 100

Calendar of Continuous assessment

Basic Bibliography

- Diederik P. Kingma and Max Welling. An Introduction to Variational Autoencoders. Foundations and Trends® in Machine Learning. 2019
- Thomas M. Cover and Joy A. Thomas. Elements of Information Theory. Second Edition. 2006

Additional Bibliography

- Abbas El Gamal and Young-Han Kim. Network Information Theory. First Edition. 2011
- Imre Csiszár and János Körner. Information Theory: Coding Theorems for Discrete Memoryless Systems. Second Edition. 2011
- Robert G. Gallager. Information Theory and Reliable Communication. First Edition. 1968

The course syllabus may change due to academic events or other reasons.