BONIFACI VINCENZO
(syllabus)
1. Introduction to information theory. Reliable transmission of information. Shannon's information content. Measures of information. Entropy, mutual information, informational divergence. Data compression. Error correction. Data processing theorems. Fundamental inequalities. Information diagrams. Informational divergence and maximum likelihood.
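As a minimal illustration of the measures of information listed above (not part of the official syllabus text; function and variable names are my own), the Shannon entropy and the informational divergence of small discrete distributions can be computed directly from their definitions, using base-2 logarithms so that the results are in bits:

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log2 p_i, in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """Informational (Kullback-Leibler) divergence D(p || q), in bits."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

fair = [0.5, 0.5]
biased = [0.9, 0.1]
print(entropy(fair))               # 1.0 bit: a fair coin is maximally uncertain
print(entropy(biased))             # strictly less than 1 bit
print(kl_divergence(biased, fair)) # nonnegative, and zero only when p = q
```

The last line illustrates a fundamental inequality from the course: D(p || q) >= 0, with equality if and only if the two distributions coincide.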
2. Source coding and data compression. Typical sequences. Typicality in probability. Asymptotic equipartition property. Block codes and variable-length codes. Coding rate. Source coding theorem. Lossless data compression. Huffman code. Universal codes. Ziv-Lempel compression.
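A compact sketch of the Huffman construction covered in this unit (an illustrative implementation, not course-provided code): repeatedly merge the two least probable subtrees, prefixing their codewords with 0 and 1.

```python
import heapq

def huffman_code(freqs):
    """Build a binary Huffman code for a {symbol: probability} dict.
    Returns a {symbol: codeword string} dict."""
    # Heap entries: (weight, tiebreaker, {symbol: partial codeword}).
    heap = [(w, i, {sym: ""}) for i, (sym, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)
        w2, _, c2 = heapq.heappop(heap)
        # Merge the two lightest subtrees, extending codewords with 0 / 1.
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (w1 + w2, counter, merged))
        counter += 1
    return heap[0][2]

freqs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
code = huffman_code(freqs)
```

For this dyadic source the expected codeword length, 1.75 bits, equals the entropy, matching the lower bound of the source coding theorem.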
3. Channel coding. Channel capacity. Discrete memoryless channels. Information transmitted over a channel. Decoding criteria. Noisy channel coding theorem.
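A concrete instance of the capacity of a discrete memoryless channel (an illustrative sketch, not course-provided code): the binary symmetric channel with crossover probability p has capacity C = 1 - H2(p) bits per use, where H2 is the binary entropy function.

```python
import math

def binary_entropy(p):
    """Binary entropy H2(p) = -p log2 p - (1-p) log2 (1-p), in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of the binary symmetric channel: C = 1 - H2(p)."""
    return 1.0 - binary_entropy(p)

print(bsc_capacity(0.0))  # 1.0: a noiseless channel carries one bit per use
print(bsc_capacity(0.5))  # 0.0: the output is independent of the input
```

By the noisy channel coding theorem, reliable communication is possible at any rate below C and at no rate above it.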
4. Further codes and applications. Hamming space. Linear codes. Generator matrix and parity-check matrix. Cyclic codes. Hash codes.
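The generator and parity-check matrices of this unit can be illustrated with the binary Hamming (7,4) code (a sketch under my own choice of systematic matrices, not course-provided code): a codeword is c = mG mod 2, a word is a codeword iff its syndrome HcT mod 2 is zero, and a single bit error is located by matching the syndrome to a column of H.

```python
# Generator matrix G (4x7) and parity-check matrix H (3x7) for the binary
# Hamming (7,4) code in systematic form; all arithmetic is modulo 2.
G = [
    [1, 0, 0, 0, 1, 1, 0],
    [0, 1, 0, 0, 1, 0, 1],
    [0, 0, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]
H = [
    [1, 1, 0, 1, 1, 0, 0],
    [1, 0, 1, 1, 0, 1, 0],
    [0, 1, 1, 1, 0, 0, 1],
]

def encode(msg):
    """Encode a 4-bit message into a 7-bit codeword: c = m G (mod 2)."""
    return [sum(m * g for m, g in zip(msg, col)) % 2 for col in zip(*G)]

def syndrome(word):
    """Syndrome s = H w (mod 2); zero iff w is a codeword."""
    return [sum(h * w for h, w in zip(row, word)) % 2 for row in H]

def correct(word):
    """Correct a single bit error by matching the syndrome to a column of H."""
    s = syndrome(word)
    if any(s):
        i = list(zip(*H)).index(tuple(s))  # position of the flipped bit
        word = word[:]
        word[i] ^= 1
    return word
```

Every nonzero syndrome equals exactly one column of H, which is why this code corrects any single error in Hamming space.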
(reference books)
David J. C. MacKay. Information Theory, Inference, and Learning Algorithms. Cambridge University Press, 2003.