This course teaches the fundamentals of Information Theory. It covers the following topics:
1) Fundamental quantities and concepts in Information Theory: entropy, Kullback-Leibler divergence, mutual information, and Jensen's inequality.
2) Lossless data compression: uniquely decodable and instantaneous source codes, Kraft's inequality, analysis of the optimal codeword length, Huffman codes, and universal compression.
3) Information theory and machine learning: generalization error, empirical risk minimization, classical statistical learning generalization guarantees, and information-theoretic generalization bounds.
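As a taste of topic 1, the sketch below (not part of the course materials; all function names are illustrative) computes entropy, KL divergence, and mutual information directly from their definitions, using the identity I(X; Y) = D(p(x, y) || p(x) p(y)).

```python
import math

def entropy(p):
    """Shannon entropy H(p) in bits; zero-probability outcomes contribute 0."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """D(p || q) in bits; assumes q[i] > 0 wherever p[i] > 0."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def mutual_information(joint):
    """I(X; Y) = D( p(x, y) || p(x) p(y) ) for a joint pmf given as a matrix."""
    px = [sum(row) for row in joint]                 # marginal of X (rows)
    py = [sum(col) for col in zip(*joint)]           # marginal of Y (columns)
    flat_joint = [p for row in joint for p in row]
    flat_prod = [a * b for a in px for b in py]      # product of marginals
    return kl_divergence(flat_joint, flat_prod)

print(entropy([0.5, 0.5]))          # fair coin: 1.0 bit
joint = [[0.4, 0.1],
         [0.1, 0.4]]                # two correlated binary variables
print(mutual_information(joint))    # positive, since X and Y are dependent
```

By Jensen's inequality, `kl_divergence` is always nonnegative, so `mutual_information` is as well, and it is exactly zero when the joint pmf factorizes into its marginals.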
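For topic 2, the following sketch (again illustrative, not the course's own code) builds the codeword lengths of a binary Huffman code and checks two facts covered in the course: the lengths satisfy Kraft's inequality, and the expected length L sits between H and H + 1.

```python
import heapq
import math

def huffman_code_lengths(probs):
    """Codeword lengths of a binary Huffman code for the given pmf."""
    if len(probs) == 1:
        return [1]
    # Heap entries: (probability, tie-break counter, symbol indices in subtree).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)   # two least-probable subtrees
        p2, _, s2 = heapq.heappop(heap)
        for i in s1 + s2:
            lengths[i] += 1               # each merge adds one bit to members
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return lengths

probs = [0.5, 0.25, 0.125, 0.125]         # dyadic source
lengths = huffman_code_lengths(probs)
print(lengths)                            # → [1, 2, 3, 3], i.e. -log2 p
kraft = sum(2 ** -l for l in lengths)     # Kraft sum, at most 1
H = -sum(p * math.log2(p) for p in probs)
L = sum(p * l for p, l in zip(probs, lengths))
print(kraft, H, L)                        # here L = H exactly (dyadic pmf)
```

For a dyadic pmf the Huffman lengths equal -log2 p(x) and the code achieves the entropy; in general H <= L < H + 1.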
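Topic 3's central objects can also be seen in a few lines. The sketch below (a toy illustration under invented assumptions: a uniform input, a noisy threshold label, and a one-parameter classifier class) runs empirical risk minimization and estimates the generalization error, i.e. the gap between test and training risk.

```python
import random

random.seed(0)

def sample(n):
    """Synthetic data: x ~ U[0, 1], label 1 iff x > 0.5, with 10% label noise."""
    data = []
    for _ in range(n):
        x = random.random()
        y = int(x > 0.5)
        if random.random() < 0.1:
            y = 1 - y                      # flip label with probability 0.1
        data.append((x, y))
    return data

def risk(theta, data):
    """Empirical 0-1 loss of the threshold classifier h(x) = 1[x > theta]."""
    return sum(int(x > theta) != y for x, y in data) / len(data)

def erm(data):
    """Empirical risk minimization over thresholds at the observed points."""
    candidates = [x for x, _ in data]
    return min(candidates, key=lambda t: risk(t, data))

train = sample(50)
test = sample(10000)                       # large sample as a proxy for true risk
theta_hat = erm(train)
gap = risk(theta_hat, test) - risk(theta_hat, train)
print(theta_hat, gap)                      # gap = estimated generalization error
```

Classical statistical learning bounds (and the information-theoretic bounds in this topic) control exactly this gap, typically shrinking at a rate like 1/sqrt(n) as the training set grows.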