The course objective is to present the fundamentals of Information Theory and its applications to communication theory and computer science, such as compression, security, and complexity. Error-correcting codes and advanced coding techniques, such as trellises and graph codes, will also be covered.
1. Introduction
---------------
1.1. Entropy, relative entropy, mutual information
1.2. Asymptotic equipartition property
1.3. Entropy rates of a stochastic process
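As a concrete companion to topic 1.1, a minimal Python sketch of the Shannon entropy of a discrete distribution (the function name and test values are illustrative, not taken from the course materials):

```python
import math

def entropy(p):
    """Shannon entropy H(X) = -sum_i p_i * log2(p_i), in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# A fair coin carries exactly one bit; a biased coin carries less.
print(entropy([0.5, 0.5]))  # 1.0
print(entropy([0.9, 0.1]))  # < 1.0
```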
2. Data compression
-------------------
2.1. Block, symbol, and stream codes; codes for integers
2.2. Huffman, Lempel-Ziv, arithmetic, and Shannon-Fano coding
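To illustrate topic 2.2, a compact Huffman-code construction that repeatedly merges the two least-probable subtrees on a heap (an illustrative sketch; the interface is ours, not from the course texts):

```python
import heapq

def huffman_code(freqs):
    """Build a Huffman code (dict: symbol -> bitstring) from symbol frequencies."""
    # Heap entries: (weight, unique tiebreaker, partial codebook).
    heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)
        w2, _, c2 = heapq.heappop(heap)
        # Prefix '0' to one subtree's codewords, '1' to the other's.
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (w1 + w2, count, merged))
        count += 1
    return heap[0][2]

# Dyadic probabilities: codeword lengths match -log2(p) exactly.
code = huffman_code({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125})
```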
3. Shannon coding theory
------------------------
3.1. Kraft inequality, Shannon's source coding theorem
3.2. Channel capacity (jointly typical sequences, Fano's inequality, Shannon's channel coding theorem and its converse)
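Topic 3.1's Kraft inequality states that a D-ary prefix code with codeword lengths l_1, ..., l_n exists if and only if sum_i D^(-l_i) <= 1. A small illustrative check:

```python
def kraft_sum(lengths, D=2):
    """Kraft sum: a D-ary prefix code with these lengths exists iff the sum is <= 1."""
    return sum(D ** -l for l in lengths)

print(kraft_sum([1, 2, 3, 3]))  # 1.0 -> a complete binary prefix code exists
print(kraft_sum([1, 1, 2]))     # 1.25 -> no prefix code with these lengths
```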
4. Differential entropy
-----------------------
4.1. Definition of differential entropy
4.2. Relationship with discrete entropy and properties
5. Gaussian channels
---------------------
5.1. Coding theorem for Gaussian channels
5.2. Parallel, colored, and feedback Gaussian channels
5.3. Error-correcting codes
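For topic 5.1, the capacity of the discrete-time AWGN channel is C = (1/2) log2(1 + SNR) bits per channel use; a one-line illustrative sketch:

```python
import math

def awgn_capacity(snr):
    """AWGN channel capacity in bits per channel use: C = 0.5 * log2(1 + SNR)."""
    return 0.5 * math.log2(1 + snr)

print(awgn_capacity(1.0))  # 0.5 bits per use at 0 dB SNR
```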
6. Rate-distortion theory
-------------------------
6.1. Quantization
6.2. Rate distortion theorem and rate distortion function
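For topic 6.2: a Gaussian source with variance sigma^2 under squared-error distortion has rate-distortion function R(D) = (1/2) log2(sigma^2 / D) for 0 < D < sigma^2, and 0 otherwise. An illustrative sketch:

```python
import math

def gaussian_rd(variance, D):
    """Rate-distortion function of a Gaussian source under squared error:
    R(D) = 0.5 * log2(variance / D) for 0 < D < variance, else 0."""
    return 0.5 * math.log2(variance / D) if D < variance else 0.0

print(gaussian_rd(1.0, 0.25))  # 1.0 bit per sample
```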
7. Advanced Coding
------------------
7.1. Advanced coding (Hash, binary codes)
7.2. Exact marginalization in trellises and graphs
7.3. Sparse graph codes
Bibliography
------------
Author | Title | Publisher | Year | ISBN | Note
------ | ----- | --------- | ---- | ---- | ----
T. M. Cover, J. A. Thomas | Elements of Information Theory (1st edition) | John Wiley & Sons, Inc. | 1991 | 0471062596 | Primary text
David J. C. MacKay | Information Theory, Inference and Learning Algorithms (1st edition) | Cambridge University Press | 2003 | n/a | Secondary text; available online
Assessment
----------
1. Written exam
2. Oral project defense