This course teaches the fundamentals of Information Theory, which concerns data compression and data transmission in digital communication systems. The topics covered are as follows:
1) Fundamental quantities and concepts in Information Theory: entropy, Kullback-Leibler divergence, mutual information, Jensen's inequality, Fano's inequality, Asymptotic Equipartition Property (AEP), method of types.
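To make the first group of quantities concrete, here is a minimal sketch (not part of the course materials) computing entropy, Kullback-Leibler divergence, and mutual information for a small illustrative joint distribution; the identity I(X;Y) = D(p(x,y) || p(x)p(y)) is used directly.

```python
import math

def entropy(p):
    """Shannon entropy H(X) in bits."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in bits."""
    return sum(x * math.log2(x / y) for x, y in zip(p, q) if x > 0)

# Example joint distribution of (X, Y) as a 2x2 table (chosen for illustration).
joint = [[0.4, 0.1],
         [0.1, 0.4]]
px = [sum(row) for row in joint]           # marginal of X
py = [sum(col) for col in zip(*joint)]     # marginal of Y

# Mutual information I(X;Y) = D( p(x,y) || p(x)p(y) ).
flat_joint = [joint[i][j] for i in range(2) for j in range(2)]
product = [px[i] * py[j] for i in range(2) for j in range(2)]
mi = kl_divergence(flat_joint, product)

print(entropy(px))   # 1.0 bit: the marginal of X is uniform
print(mi)            # positive, since X and Y are dependent
```

The same value of I(X;Y) can be recovered as H(X) + H(Y) minus the joint entropy, which is a useful consistency check.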
2) Data compression: uniquely decodable and instantaneous source codes, Kraft's inequality, analysis of the optimal codeword length, Huffman codes, almost lossless source coding.
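The compression topics above can be sketched in a few lines: the following hypothetical example (the pmf is chosen only for illustration) builds binary Huffman codeword lengths with a heap, then checks Kraft's inequality and the bound H ≤ L < H + 1 on the expected codeword length.

```python
import heapq
import math

def huffman_lengths(probs):
    """Return codeword lengths of a binary Huffman code for the given pmf."""
    # Heap entries: (probability, tiebreak id, symbol indices in that subtree).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for sym in s1 + s2:        # each merge deepens these symbols by one level
            lengths[sym] += 1
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return lengths

probs = [0.4, 0.2, 0.2, 0.1, 0.1]
lengths = huffman_lengths(probs)

kraft = sum(2 ** -l for l in lengths)            # Kraft's inequality: sum <= 1
avg_len = sum(p * l for p, l in zip(probs, lengths))
H = -sum(p * math.log2(p) for p in probs)
print(lengths, kraft, avg_len, H)                # H <= avg_len < H + 1
```

Because the Huffman tree is complete, the Kraft sum comes out exactly 1 here, and the expected length (2.2 bits) sits between the entropy (about 2.12 bits) and entropy plus one.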
3) Data transmission: description of the information-theoretic communication system, channel capacity, Kuhn-Tucker conditions, the channel coding theorem, the joint source-channel coding theorem.
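As a small illustration of channel capacity (a standard textbook example, not specific to this course), the binary symmetric channel with crossover probability p has capacity 1 − H(p), where H is the binary entropy function:

```python
import math

def binary_entropy(p):
    """Binary entropy function H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of the binary symmetric channel with crossover probability p."""
    return 1 - binary_entropy(p)

print(bsc_capacity(0.0))   # 1.0: noiseless channel
print(bsc_capacity(0.5))   # 0.0: output is independent of the input
print(bsc_capacity(0.11))  # roughly 0.5 bit per channel use
```

The channel coding theorem says rates below this capacity are achievable with arbitrarily small error probability, and rates above it are not.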
4) Data transmission over the Gaussian channel: differential entropy, entropy-maximizing property of Gaussian random variables, the channel capacity of the Gaussian channel.
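For the Gaussian channel, the two formulas named above can be verified numerically: the capacity C = ½ log2(1 + P/N) equals the differential-entropy difference h(Y) − h(Z) when the input is Gaussian with power P and the noise is Gaussian with power N (a minimal sketch, with powers chosen for illustration):

```python
import math

def gaussian_capacity(snr):
    """AWGN channel capacity, C = 0.5 * log2(1 + P/N), in bits per channel use."""
    return 0.5 * math.log2(1 + snr)

def gaussian_diff_entropy(var):
    """Differential entropy of N(0, var): h = 0.5 * log2(2*pi*e*var) bits."""
    return 0.5 * math.log2(2 * math.pi * math.e * var)

P, N = 1.0, 1.0                       # signal power equal to noise power
print(gaussian_capacity(P / N))       # 0.5 bit per channel use
# Same value as the entropy difference h(Y) - h(Z), with Y = X + Z:
print(gaussian_diff_entropy(P + N) - gaussian_diff_entropy(N))
```

The entropy-maximizing property of the Gaussian is exactly what makes the Gaussian input optimal here: among all distributions with a given variance, the Gaussian maximizes differential entropy.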