Teacher: CAMPISI PATRIZIO
(syllabus)
Elements of information theory: entropy of a source, relative entropy, joint entropy and conditional entropy. Sufficient statistics.
Lossless source coding: optimal codes. Bounds on codeword lengths for optimal codes. Kraft inequality for uniquely decodable codes. Huffman and Shannon-Fano-Elias encoders. Universal source coding. Arithmetic encoders. Lempel-Ziv encoder.
Equivocation, mutual information rate, channel capacity. Capacity of binary symmetric channels and of band-limited channels affected by additive Gaussian noise. Shannon's channel coding theorem. Fano inequality. Separation theorem between source coding and channel coding.
Linear block codes: definition, generator matrix, parity checks, systematic codes. Error detection and correction for linear block codes. Syndrome. Dual code of a linear block code. Optimal decoder. Error detection and correction over binary symmetric channels. Standard array. Performance.
Galois fields: definitions and properties. Cyclic codes. Hamming codes. Reed-Solomon codes.
Convolutional codes: definitions and properties. Maximum likelihood decoding: binary symmetric channels and Gaussian channels. Markov chains: definitions and properties. Viterbi algorithm: principle, implementation and performance.
Turbo codes: definitions and operating principle. Concatenated codes. Recursive systematic convolutional encoders. Interleavers for convolutional codes. Calculation of the a posteriori probability (APP) for turbo codes. Operating principle of hybrid ARQ protocols. Turbo code decoders: decoding algorithm.
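The source coding topics above (entropy, Kraft inequality, Huffman codes) admit a short numerical check. The following sketch is illustrative only, not part of the official course material: it computes the entropy of a dyadic source, builds a binary Huffman code, and verifies that the average codeword length meets the lossless-coding bound H(X) <= L < H(X) + 1 and the Kraft inequality.

```python
import heapq
from math import log2

def entropy(p):
    """Shannon entropy H(X) = -sum p_i log2 p_i of a source distribution."""
    return -sum(x * log2(x) for x in p if x > 0)

def huffman_lengths(p):
    """Codeword lengths of a binary Huffman code for distribution p."""
    # Heap items: (probability, tie-breaking counter, symbol indices in subtree)
    heap = [(pi, i, [i]) for i, pi in enumerate(p)]
    heapq.heapify(heap)
    lengths = [0] * len(p)
    counter = len(p)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)   # merge the two least probable nodes;
        p2, _, s2 = heapq.heappop(heap)   # every symbol below them gains one bit
        for i in s1 + s2:
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return lengths

p = [0.5, 0.25, 0.125, 0.125]
H = entropy(p)                              # 1.75 bits
L = huffman_lengths(p)                      # lengths [1, 2, 3, 3]
avg = sum(pi * li for pi, li in zip(p, L))  # 1.75: equals H for a dyadic source
kraft = sum(2 ** -l for l in L)             # 1.0: Kraft inequality met with equality
```

For a dyadic distribution the Huffman code is exactly optimal (average length equals the entropy); for general distributions the average length stays within one bit of H(X).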
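Likewise, the channel coding topics (generator matrix, syndrome, Hamming codes, error correction over binary symmetric channels) can be illustrated with a systematic (7,4) Hamming code. This is an illustrative sketch, not course material: with G = [I4 | P] and H = [P^T | I3], every single-bit error produces a distinct nonzero syndrome equal to the corresponding column of H, so syndrome decoding corrects any one error.

```python
# Systematic (7,4) Hamming code over GF(2): G = [I4 | P], H = [P^T | I3].
P = [[1, 1, 0], [1, 0, 1], [0, 1, 1], [1, 1, 1]]
G = [[int(i == j) for j in range(4)] + P[i] for i in range(4)]
# Ht[i] is column i of H, i.e. the syndrome of a single error in bit i;
# the 7 columns are exactly the 7 nonzero binary vectors of length 3.
Ht = [row[:] for row in P] + [[int(i == j) for j in range(3)] for i in range(3)]

def encode(m):
    """Codeword c = m G (mod 2) for a 4-bit message m."""
    return [sum(m[i] * G[i][j] for i in range(4)) % 2 for j in range(7)]

def decode(r):
    """Syndrome decoding: compute s = H r^T and correct at most one bit error."""
    s = [sum(r[i] * Ht[i][k] for i in range(7)) % 2 for k in range(3)]
    if any(s):
        r = r[:]
        r[Ht.index(s)] ^= 1   # syndrome equals the column of the flipped bit
    return r[:4]              # systematic code: the message is the first 4 bits

m = [1, 0, 1, 1]
c = encode(m)          # [1, 0, 1, 1, 0, 1, 0]
r = c[:]
r[2] ^= 1              # single error, as on a binary symmetric channel
assert decode(r) == m  # the error is located and corrected
```

The same syndrome table idea underlies the standard array: each correctable error pattern is the coset leader indexed by its syndrome.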
(reference books)
Thomas M. Cover, Joy A. Thomas, Elements of Information Theory, 2nd ed., John Wiley & Sons, 2006.