Teacher: NERI ALESSANDRO
(syllabus)
Elements of Information Theory: Entropy, Relative Entropy, and Mutual Information. Joint Entropy and Conditional Entropy. Sufficient Statistics. Fano's Inequality. Lossless source coding. Optimal Codes. Bounds on the Optimal Code Length. Kraft Inequality for Uniquely Decodable Codes. Huffman Codes. Optimality of Huffman Codes. Shannon-Fano-Elias Coding. Universal Source Coding. Universal Coding for Binary Sequences. Arithmetic Coding. Lempel-Ziv Coding.

Mutual Information and Equivocation. Channel Capacity. Properties of Channel Capacity. Examples of Channel Capacity: Binary Symmetric and Gaussian Channels. Channel Coding Theorem. Fano's Inequality and the Converse to the Coding Theorem. Source-Channel Separation Theorem.

Rate Distortion Theory. Scalar Quantization. Max-Lloyd Quantizer. Vector Quantization. Linde, Buzo, and Gray Algorithm.

Audio signal coding. Characteristics of the human auditory system: perception of complex audio signals, absolute threshold of hearing, critical bandwidth, anatomy and physiology of the basilar membrane. Masking mechanisms. Speech coding in the time domain: G.711, G.726, and G.727. Model-based coding: G.729. Audio coding in the transform domain: MPEG-1 Layer I, II, and III; MPEG-2 and MPEG-4. Image coding elements: the Discrete Cosine Transform. JPEG encoding.

Channel Coding. Linear Block Codes. Generator Matrix. Block Codes in Systematic Form. Parity Check Matrix. Syndrome Error Detection. Minimum Distance of a Block Code. Error-Correction Capability of a Block Code. Syndrome Detection and the Standard Array. Hamming Codes. Reed-Solomon Codes. Convolutional Codes. Convolutional Encoder Representations. Distance Properties of Convolutional Codes. Maximum Likelihood Detection. Markov Chains. Decoding of Convolutional Codes: the Viterbi Algorithm. Error Probability Analysis for Convolutional Codes. Hard and Soft Decisions. Punctured Convolutional Codes and Rate-Compatible Schemes. Concatenated Encoders. Recursive Systematic Convolutional Encoders. Turbo Encoders. Construction Methods for Turbo Codes: Interleavers, Code Concatenation Methods. Turbo Code Performance as a Function of Size and Type of Interleaver. Decoding of Turbo Codes: the Turbo Decoder, Probabilities and Estimates, Symbol Detection, the Log-Likelihood Ratio. The BCJR MAP Algorithm and the LLR Calculation. Turbo Decoding.

Forward Error Correction and Automatic Repeat ReQuest. Forward Error Correction. Automatic Repeat ReQuest. ARQ Schemes. ARQ Scheme Efficiencies. Hybrid-ARQ Schemes. Low-Density Parity-Check Codes. Description of LDPC Codes. Construction of LDPC Codes. Decoding of LDPC Codes: the Tanner Graph. Decoding Algorithms for LDPC Codes. Packet-oriented channel coding: Fountain and Raptor Codes.

Estimation Theory. Minimum Mean Square Error estimator: general solution and the Gaussian-in-Gaussian case. Minimum Mean Magnitude Error estimator. Maximum A Posteriori Probability and Maximum Likelihood estimates. Cramér-Rao lower bound for one-dimensional and multidimensional estimates. Efficient estimators. Fisher's Information Matrix. Time-of-arrival estimation. Joint time and Doppler shift estimation. Wavelet transform and multiresolution analysis. Signal restoration based on wavelet shrinkage.
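
For the lossless source coding topics above (entropy, the bounds on optimal code length, Huffman codes), the following is a minimal illustrative Python sketch, not part of the official course material. It assumes a memoryless source with a hypothetical four-symbol distribution and checks that the average Huffman codeword length falls between H(X) and H(X) + 1.

    # Illustrative sketch only; the distribution below is a hypothetical example.
    import heapq
    import math

    def entropy(probs):
        # Shannon entropy in bits of a discrete distribution.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    def huffman_code(probs):
        # Return a prefix-free binary code {symbol index: bit string}.
        heap = [(p, i, {i: ""}) for i, p in enumerate(probs)]
        heapq.heapify(heap)
        tie = len(probs)  # tie-breaker so dictionaries are never compared
        while len(heap) > 1:
            p1, _, c1 = heapq.heappop(heap)
            p2, _, c2 = heapq.heappop(heap)
            merged = {s: "0" + w for s, w in c1.items()}
            merged.update({s: "1" + w for s, w in c2.items()})
            heapq.heappush(heap, (p1 + p2, tie, merged))
            tie += 1
        return heap[0][2]

    probs = [0.4, 0.3, 0.2, 0.1]   # hypothetical source distribution
    code = huffman_code(probs)
    avg_len = sum(p * len(code[i]) for i, p in enumerate(probs))
    print(f"H(X) = {entropy(probs):.3f} bits, average length = {avg_len:.3f} bits/symbol")

For this distribution the sketch prints H(X) of about 1.846 bits and an average codeword length of 1.9 bits/symbol, consistent with the bound H(X) <= L < H(X) + 1 stated in the syllabus.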
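
For the channel coding topics (linear block codes in systematic form, parity-check matrix, syndrome decoding), a similarly minimal sketch of the (7,4) Hamming code; the generator and parity-check matrices below are one standard systematic construction, assumed here only for illustration.

    # Illustrative sketch only; GF(2) arithmetic is done with mod-2 reductions.
    import numpy as np

    P = np.array([[1, 1, 0],
                  [0, 1, 1],
                  [1, 1, 1],
                  [1, 0, 1]])
    G = np.hstack([np.eye(4, dtype=int), P])      # generator matrix G = [I | P]
    H = np.hstack([P.T, np.eye(3, dtype=int)])    # parity-check matrix H = [P^T | I]

    msg = np.array([1, 0, 1, 1])                  # hypothetical 4-bit message
    codeword = msg @ G % 2
    received = codeword.copy()
    received[2] ^= 1                              # flip one bit to simulate a channel error

    syndrome = received @ H.T % 2                 # nonzero syndrome signals an error
    for pos in range(7):                          # the syndrome equals the column of H
        if np.array_equal(H[:, pos], syndrome):   # at the error position (single-error case)
            received[pos] ^= 1
            break
    print("decoded:", received[:4], "ok:", np.array_equal(received[:4], msg))

Since all columns of H are distinct and nonzero, any single bit error produces a unique syndrome and is corrected, which is the minimum-distance-3 property of Hamming codes listed in the syllabus.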
(reference books)
A. Neri, "Appunti dalle lezioni di Teoria dell’Informazione e Codici" (lecture notes on Information Theory and Codes), freely available at http://elearning.dia.uniroma3.it/moodle/.
Additional books are freely available in the e-book collection of the ROMA TRE Biblioteca di Area Scientifico-tecnologica (http://host.uniroma3.it/biblioteche/page.php?page=collezion0):
Thomas M. Cover, Joy A. Thomas, "Elements of Information Theory", 2nd ed., John Wiley & Sons, 2006.
Jorge Castiñeira Moreira, Patrick Guy Farrell, "Essentials of Error-Control Coding", John Wiley & Sons, 2006.