This course teaches the fundamentals of Information Theory. The topics covered in this course are as follows:
1) Fundamental quantities and concepts in Information Theory: entropy, Kullback-Leibler divergence, mutual information, Jensen's inequality, Fisher information, and the Cramér-Rao bound.
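As a concrete illustration of these quantities (a minimal NumPy sketch, not part of the course materials; the example distribution is arbitrary), the code below computes entropy, Kullback-Leibler divergence, and mutual information, and checks the identity I(X;Y) = H(X) + H(Y) - H(X,Y):

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(p) in bits; 0*log(0) is taken as 0."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log2(p[nz]))

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in bits (assumes q > 0 wherever p > 0)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    nz = p > 0
    return np.sum(p[nz] * np.log2(p[nz] / q[nz]))

# Example joint distribution of (X, Y) as a matrix pxy[x, y].
pxy = np.array([[0.4, 0.1],
                [0.1, 0.4]])
px, py = pxy.sum(axis=1), pxy.sum(axis=0)   # marginals

# Mutual information as a KL divergence: I(X;Y) = D(p(x,y) || p(x)p(y)).
mi = kl_divergence(pxy.ravel(), np.outer(px, py).ravel())

# Equivalent identity: I(X;Y) = H(X) + H(Y) - H(X,Y).
mi_alt = entropy(px) + entropy(py) - entropy(pxy.ravel())
```

By Jensen's inequality the KL divergence, and hence the mutual information, is always nonnegative, which the example distribution confirms numerically.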
2) Lossless data compression: uniquely decodable and instantaneous source codes, Kraft's inequality, analysis of the optimal codeword length, Huffman codes, and universal compression.
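To make the compression topics concrete, here is a short sketch of Huffman coding (a toy implementation, not the course's reference code; the symbol probabilities are chosen to be dyadic so the optimal average length equals the entropy). It also verifies Kraft's inequality for the resulting codeword lengths:

```python
import heapq

def huffman_code(probs):
    """Build a binary Huffman code for a dict {symbol: probability}."""
    # Heap entries carry an insertion counter so ties in probability
    # never fall through to comparing dicts.
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)   # two least probable subtrees
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, counter, merged))
        counter += 1
    return heap[0][2]

probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
code = huffman_code(probs)

# Kraft's inequality: sum over codewords of 2^(-length) <= 1.
kraft = sum(2 ** -len(w) for w in code.values())

# Expected codeword length; for dyadic probabilities it equals the entropy.
avg_len = sum(p * len(code[s]) for s, p in probs.items())
```

The resulting code is prefix-free (instantaneous), so no codeword is a prefix of another, which is what makes decoding unambiguous symbol by symbol.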
3) Information theory and machine learning: EM algorithm, variational autoencoders, diffusion models, decision trees.
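As a taste of the machine-learning material, the following is a minimal sketch of the EM algorithm for a two-component 1-D Gaussian mixture (a toy example with synthetic data; the data-generating parameters and initial guesses are arbitrary). Each E-step computes posterior responsibilities, and each M-step re-estimates the parameters, monotonically improving a variational lower bound on the log-likelihood:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic data: two Gaussian clusters (parameters chosen for the demo).
x = np.concatenate([rng.normal(-2.0, 1.0, 200), rng.normal(3.0, 1.0, 200)])

# Initial guesses for mixture weights, means, and variances.
pi = np.array([0.5, 0.5])
mu = np.array([-1.0, 1.0])
var = np.array([1.0, 1.0])

def gauss(x, mu, var):
    """Gaussian density, broadcast over components."""
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

for _ in range(50):
    # E-step: responsibilities r[n, k] = p(z_n = k | x_n, current params).
    joint = pi * gauss(x[:, None], mu, var)          # shape (N, 2)
    r = joint / joint.sum(axis=1, keepdims=True)
    # M-step: weighted maximum-likelihood re-estimates.
    nk = r.sum(axis=0)
    pi = nk / len(x)
    mu = (r * x[:, None]).sum(axis=0) / nk
    var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
```

The same decomposition of the log-likelihood into an evidence lower bound plus a KL divergence underlies variational autoencoders and diffusion models, which is why these methods sit naturally in an information theory course.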