Derived from: 20410138 IN420 - Information Theory in Computational Sciences, LM-40, Marco Pedicini

(syllabus)
1) Introduction to information theory
- Error-correcting codes. The binary symmetric channel. Block codes. Hamming codes; the (7,4) Hamming code. The generator matrix of a code in the linear case. Decoding the (7,4) Hamming code. Syndromes. Syndrome decoding. The effectiveness of a given code. Channel capacity. Symmetric codes. Graphical representation associated with a code. (A worked sketch of the (7,4) code follows this section's reading list.)
- Probability. Probability spaces. Discrete probability. Posterior probability. The maximum likelihood principle. The definition of entropy. Information content according to Shannon. Redundancy. Joint entropy. The decomposition rule for computing entropy. Gibbs' inequality. Jensen's inequality. (A numerical check of these identities also follows the reading list.)
- Inference.
([1] chapters 1, 2, 3)
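As a companion to the (7,4) Hamming code topics above, here is a minimal Python sketch of encoding via a generator matrix and single-error correction via syndrome decoding. The particular systematic forms chosen for G and H are an assumption (several equivalent conventions exist); the logic is what the course covers.

```python
import numpy as np

# One standard systematic form of the (7,4) Hamming code:
# codeword = [d1 d2 d3 d4 p1 p2 p3], all arithmetic mod 2.
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def encode(bits4):
    """Encode 4 data bits into a 7-bit codeword: c = d G (mod 2)."""
    return np.array(bits4) @ G % 2

def decode(received7):
    """Syndrome decoding: s = H r (mod 2) locates a single flipped bit."""
    r = np.array(received7).copy()
    s = H @ r % 2
    if s.any():
        # The columns of H are the 7 distinct nonzero syndromes, so the
        # syndrome equals the column at the error position.
        for pos in range(7):
            if np.array_equal(H[:, pos], s):
                r[pos] ^= 1
                break
    return r[:4]  # systematic code: data bits occupy the first 4 positions

d = [1, 0, 1, 1]
c = encode(d)
c[2] ^= 1                      # the channel flips one bit
assert list(decode(c)) == d    # the single error is corrected
```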
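The entropy identities listed above are easy to verify numerically. The sketch below uses a small made-up joint distribution (the numbers are illustrative only) to check the decomposition rule H(X,Y) = H(X) + H(Y|X) and Gibbs' inequality.

```python
import numpy as np

def H(p):
    """Shannon entropy in bits of a distribution given as an array."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    return -(nz * np.log2(nz)).sum()

# A small made-up joint distribution p(x, y) (rows: x, columns: y).
pxy = np.array([[0.30, 0.10],
                [0.05, 0.55]])
px = pxy.sum(axis=1)   # marginal of X
py = pxy.sum(axis=0)   # marginal of Y

# Decomposition rule: H(X,Y) = H(X) + H(Y|X).
HY_given_X = sum(px[i] * H(pxy[i] / px[i]) for i in range(len(px)))
assert np.isclose(H(pxy.ravel()), H(px) + HY_given_X)

# Gibbs' inequality: sum_x p(x) log2(1/q(x)) >= H(p), equality iff q = p.
q = np.array([0.5, 0.5])
cross = -(px * np.log2(q)).sum()
assert cross >= H(px)
```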
2) Data Compression
- The source coding theorem. Measuring the information content of a random variable. Raw bit content. Information content and lossy compression. Essential bit content. Shannon's source coding theorem. Typical sets. The asymptotic equipartition principle.
- Symbol codes. Lossless coding. Prefix codes. Unique decodability. The Kraft inequality. Optimal codes. Huffman codes. (A Huffman construction is sketched after this section's reading list.)
- Stream codes. Arithmetic codes. The Bayesian model. Huffman codes with a header. Arithmetic coding with Laplace predictive modelling of the distribution. Arithmetic coding with a Dirichlet predictive model. Lempel-Ziv coding (also sketched after the reading list).
([1] chapters 4, 5, 6)
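As a companion to the symbol-code topics, here is a minimal Python sketch of Huffman code construction with a binary heap, together with checks of the Kraft inequality and of the expected length against the entropy. The dyadic example distribution is made up for illustration.

```python
import heapq
from math import log2

def huffman(probs):
    """Build a binary Huffman code for {symbol: probability}.
    Returns {symbol: codeword string}. A minimal sketch."""
    # Heap items: (probability, tie-breaker, {symbol: partial codeword}).
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)  # the two least probable subtrees
        p1, _, c1 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c0.items()}
        merged.update({s: "1" + w for s, w in c1.items()})
        heapq.heappush(heap, (p0 + p1, count, merged))
        count += 1
    return heap[0][2]

probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
code = huffman(probs)
# Kraft inequality: any prefix code satisfies sum_i 2^{-l_i} <= 1.
assert sum(2 ** -len(w) for w in code.values()) <= 1
# Expected length is within 1 bit of the entropy; for dyadic
# probabilities like these the two coincide exactly.
L = sum(p * len(code[s]) for s, p in probs.items())
Hs = -sum(p * log2(p) for p in probs.values())
print(code, L, Hs)  # e.g. {'a': '0', 'b': '10', ...}, L = H = 1.75
```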
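Lempel-Ziv coding admits an equally short sketch. The parser below follows the LZ78 convention (a pointer to the longest previously seen phrase plus one innovating character); this is one of several Lempel-Ziv variants and is chosen here only for brevity.

```python
def lz78_parse(s):
    """LZ78-style parsing: emit (index of longest known prefix, next char).
    Index 0 is the empty string; each emitted pair defines a new phrase."""
    book = {"": 0}
    out = []
    w = ""
    for ch in s:
        if w + ch in book:
            w += ch                     # extend the current match
        else:
            out.append((book[w], ch))   # pointer + one innovating character
            book[w + ch] = len(book)
            w = ""
    if w:                               # leftover phrase already in the book
        out.append((book[w], None))
    return out

pairs = lz78_parse("aababbbabaababbbabbabb")
print(pairs)  # [(0, 'a'), (1, 'b'), (0, 'b'), ...]
```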
3) Channel coding in the presence of noise
- Dependent random variables. Joint entropy. Conditional entropy. Mutual information. Conditional mutual information.
- Communication over noisy channels. The discrete memoryless channel (DMC). Examples: the binary symmetric channel (BSC), the binary erasure channel (BEC), the noisy typewriter (NT), the Z channel. The information conveyed by a channel. (A capacity computation for the BSC follows this section's reading list.)
- The noisy-channel coding theorem. Optimal decoding. Probability of error on a block and on average per bit. Typical sequences and joint typicality sets. Decoding by typicality sets. The rate of information transmission when a channel is used above capacity.
- Error-correcting codes and applications.
([1] chapters 8, 9, 10, 11)
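For the BSC, the information conveyed by the channel has the closed form C = 1 - H_2(f), where f is the flip probability. The sketch below computes I(X;Y) = H(Y) - H(Y|X) directly and confirms, by a grid search over input distributions (an illustrative check, not a proof), that the uniform input achieves the formula.

```python
import numpy as np

def H2(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def mutual_information_bsc(px1, f):
    """I(X;Y) for a BSC with flip probability f and P(X=1) = px1."""
    py1 = px1 * (1 - f) + (1 - px1) * f   # output distribution
    return H2(py1) - H2(f)                # I = H(Y) - H(Y|X)

f = 0.1
# Capacity = max over input distributions of I(X;Y); uniform input is optimal.
grid = np.linspace(0.001, 0.999, 999)
C_numeric = max(mutual_information_bsc(p, f) for p in grid)
C_formula = 1 - H2(f)
assert abs(C_numeric - C_formula) < 1e-4
print(C_formula)  # ~0.531 bits per channel use
```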
4) Further codes
- Hash codes (a toy sketch follows the reading list).
- Linear codes.
([1] chapters 12, 14)
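Hash codes in [1], chapter 12, are motivated by information retrieval. The toy sketch below (the names and the 16-bit truncation are illustrative choices, not from the course material) stores only short hashes of the strings in a set, trading a small false-positive probability for memory.

```python
import hashlib

def h(s, m=16):
    """First m bits of SHA-256(s), as an integer (a toy m-bit hash code)."""
    return int.from_bytes(hashlib.sha256(s.encode()).digest(), "big") >> (256 - m)

database = {"alice", "bob", "carol"}
hashes = {h(s) for s in database}     # store m bits per entry, not the strings

print(h("alice") in hashes)    # True: a stored string is always found
print(h("mallory") in hashes)  # almost surely False; wrong with prob ~ 2^-16
```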
(reference books)
[1] David J. C. MacKay, Information Theory, Inference, and Learning Algorithms, Cambridge University Press (2003).
[2] R. Blahut, Principles and Practice of Information Theory, Addison-Wesley (1987).
[3] T. M. Cover and J. A. Thomas, Elements of Information Theory, Wiley (1991).