Checking date: 30/04/2019


Course: 2019/2020

Information Theory
(18539)
Master in Advanced Communications Technologies (Plan: 436 - Estudio: 278)
EPI


Coordinating teacher: KOCH, TOBIAS MIRCO

Department assigned to the subject: Signal and Communications Theory Department

Type: Electives
ECTS Credits: 6.0

Course:
Semester:




Requirements (Subjects that are assumed to be known)
Students should have a solid basis in probability and calculus, as well as an affinity for mathematics. Having taken a course on Digital Communications / Communication Theory is also helpful.
Objectives
Competences and skills that will be acquired and learning results: This course teaches the fundamentals of Information Theory, including the basic source coding and channel coding theorems. Students will acquire a profound understanding of:
  • the concepts of data compression/transmission in digital communication systems;
  • the fundamental limits of source codes and error-correcting codes;
  • information-theoretic quantities, such as entropy, Kullback-Leibler divergence, and mutual information;
  • mathematical tools/concepts commonly used in Information Theory, such as Jensen's inequality, Fano's inequality, and the Asymptotic Equipartition Property (AEP).
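As an illustrative sketch only (not part of the official course materials), the following Python snippet computes three of the quantities named above, entropy, Kullback-Leibler divergence, and mutual information, for small hypothetical distributions; the function names and example numbers are chosen purely for illustration.

from math import log2

def entropy(p):
    """Shannon entropy H(p) in bits of a probability vector p."""
    return -sum(pi * log2(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p||q) in bits (assumes q_i > 0 wherever p_i > 0)."""
    return sum(pi * log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def mutual_information(joint):
    """Mutual information I(X;Y) in bits from a joint distribution given as a 2-D list."""
    px = [sum(row) for row in joint]               # marginal of X
    py = [sum(col) for col in zip(*joint)]         # marginal of Y
    return sum(pxy * log2(pxy / (px[i] * py[j]))
               for i, row in enumerate(joint)
               for j, pxy in enumerate(row) if pxy > 0)

# Hypothetical joint distribution of (X, Y), chosen for illustration only
joint = [[0.4, 0.1],
         [0.1, 0.4]]
print(entropy([0.5, 0.5]))                         # 1.0 bit
print(kl_divergence([0.5, 0.5], [0.75, 0.25]))     # D(p||q) for two hypothetical distributions
print(mutual_information(joint))                   # I(X;Y) = 1 - H(0.2) ≈ 0.278 bits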
Description of contents: programme
This course teaches the fundamentals of Information Theory, which concerns data compression and transmission in digital communication systems. The topics covered in this course are as follows:
1) Fundamental quantities and concepts in Information Theory: entropy, Kullback-Leibler divergence, mutual information, Jensen's inequality, Fano's inequality, Asymptotic Equipartition Property (AEP), method of types.
2) Data compression: uniquely decodable and instantaneous source codes, Kraft's inequality, analysis of the optimal codeword length, Huffman codes, almost lossless source coding.
3) Data transmission: description of the information-theoretic communication system, channel capacity, Kuhn-Tucker conditions, the channel coding theorem, the joint source-channel coding theorem.
4) Data transmission over the Gaussian channel: differential entropy, entropy-maximizing property of Gaussian random variables, the channel capacity of the Gaussian channel.
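To give a concrete flavour of topic 2, here is a minimal Huffman-coding sketch in Python (again purely illustrative, not official course material); the source distribution is hypothetical. For this example the average codeword length can be compared with the source entropy H(X), illustrating the bound H(X) <= L < H(X) + 1 satisfied by optimal codes.

import heapq
from math import log2

def huffman_code(probs):
    """Build a binary Huffman code for a dict {symbol: probability}.
    Returns {symbol: codeword string}."""
    # Each heap entry: (probability, tie-breaker, {symbol: partial codeword})
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    if len(heap) == 1:
        # Degenerate case: a single symbol gets the one-bit codeword "0"
        return {s: "0" for s in probs}
    while len(heap) > 1:
        p1, _, code1 = heapq.heappop(heap)   # two least likely subtrees
        p2, _, code2 = heapq.heappop(heap)
        # Prefix the two merged subtrees with '0' and '1'
        merged = {s: "0" + c for s, c in code1.items()}
        merged.update({s: "1" + c for s, c in code2.items()})
        heapq.heappush(heap, (p1 + p2, counter, merged))
        counter += 1
    return heap[0][2]

probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}   # hypothetical source
code = huffman_code(probs)
entropy = -sum(p * log2(p) for p in probs.values())
avg_len = sum(probs[s] * len(code[s]) for s in probs)
print(code, entropy, avg_len)   # avg_len satisfies H(X) <= L < H(X) + 1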
Learning activities and methodology
Lectures: The basic concepts will mainly be taught at the blackboard, closely following the book "Elements of Information Theory" by Cover & Thomas (see Basic Bibliography). Exercises: To deepen the understanding of the taught material, students hand in solutions to a set of problems every two weeks. These solutions are graded from 1 to 10, and the average grade over the whole semester constitutes the continuous-assessment grade. Both lectures and exercises will be held in English.
Assessment System
  • End-of-term examination: 60%
  • Continuous assessment (assignments, laboratory, practicals, ...): 40%
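For illustration only (the numbers are hypothetical and a 0-10 grading scale is assumed): a student scoring 7.0 on the end-of-term examination and 8.0 on the continuous assessment would obtain a final grade of 0.6 · 7.0 + 0.4 · 8.0 = 7.4.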

Basic Bibliography
  • Thomas M. Cover and Joy A. Thomas. Elements of Information Theory. Second Edition. 2006
Additional Bibliography
  • Abbas El Gamal and Young-Han Kim. Network Information Theory. First Edition. 2011
  • Imre Csiszár and János Körner. Information Theory: Coding Theorems for Discrete Memoryless Systems. Second Edition. 2011
  • Robert G. Gallager. Information Theory and Reliable Communication. First Edition. 1968

The course syllabus may change due to academic events or other reasons.