## Information Theory [MA5103]

### Winter Semester 2012/13

### Prof. Dr. Michael M. Wolf

- Lecturer: Prof. Dr. Michael M. Wolf
- Lecture: Fri 08:30 - 10:00, Room 00.07.011 (registration)
- Central exercise session: Mon 14:15 - 15:45, Room 00.07.011 (registration)
- Tutorials: registration

### News

- No exercise session on Mon, Feb 4. The solution can be found here.
- Examination on Friday, February 15, 14:00-15:00, in MI HS3. Note: literature and notes of any kind are not allowed during the examination. Here is an example examination from previous semesters.

file | version | remarks |
---|---|---|
Homework 1 | 19.10.12 | |
Homework 2 | 14.11.12 | Typos corrected. |
Homework 3 | 14.11.12 | |
Homework 4 | 30.11.12 | The capacity formula has been added. |
Homework 5 | 14.12.12 | |
Homework 6 | 21.01.13 | Errors corrected. |
Homework 7 | 25.01.13 | |

### Content

The course introduces the basics of classical information theory. This includes entropies and their properties, the asymptotic equipartition property, data compression, coding theory, channel capacities, and an information-theoretic perspective on statistics and data analysis.

file | version | content |
---|---|---|
Lecture 1 | 19.10.12 | Introduction, preliminaries on probability theory, convexity, and entropic quantities |
Lecture 2 | 26.10.12 | Entropy inequalities, mutual information, conditional entropy, Venn diagrams, perfect cryptosystems, chain rules |
Lecture 3 | 05.11.12 | Data processing inequality, Fano's inequality, entropy rates |
Lecture 4 | 09.11.12 | Data compression, symbol codes, code trees for prefix codes, Kraft's inequality, entropy (rate) bounds |
Lecture 5 | 16.11.12 | Huffman codes, arithmetic coding, Lempel-Ziv coding |
Lecture 6 | 23.11.12 | Lossy compression, AEP, typical sets, Shannon's source coding theorem |
Lecture 7 | 30.11.12 | Discrete memoryless channels, joint AEP, rates, capacities |
Lecture 8 | 07.12.12 | Shannon's noisy channel coding theorem |
Lecture 9 | 14.12.12 | Properties of the capacity, codes with feedback |
Lecture 10 | 21.12.12 | Source-channel separation, error-correcting codes |
Lecture 11 | 11.01.13 | Linear codes, Hamming bound, Gilbert-Varshamov bound |
Lecture 12 | 18.01.13 | Singleton bound, MDS codes, Reed-Solomon codes |
Lecture 13 | 28.01.13 | Decoding Reed-Solomon codes; a reference is R.J. McEliece, "The Decoding of Reed-Solomon Codes" |
Lecture 14 | 01.02.13 | Signal recovery via L1 and L2 uncertainty relations, Logan's phenomenon |
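As a small illustration of the compression topics above (entropy, Kraft's inequality, Huffman codes from Lectures 4-5), here is a minimal Python sketch, not part of the course material, that builds a Huffman code for a toy distribution and checks the standard facts: the Kraft sum of the codeword lengths is at most 1, and for a dyadic distribution the expected codeword length equals the Shannon entropy.

```python
import heapq
from itertools import count
from math import log2

def entropy(p):
    """Shannon entropy H(X) = -sum_i p_i log2 p_i of a distribution."""
    return -sum(q * log2(q) for q in p.values() if q > 0)

def huffman_code(p):
    """Binary Huffman code for a distribution p: symbol -> probability.

    Returns a prefix-free mapping symbol -> codeword ('0'/'1' string).
    """
    tiebreak = count()  # unique counter so equal-probability nodes never compare dicts
    heap = [(q, next(tiebreak), {s: ""}) for s, q in p.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        # Repeatedly merge the two least probable subtrees.
        q1, _, c1 = heapq.heappop(heap)
        q2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (q1 + q2, next(tiebreak), merged))
    return heap[0][2]

# Dyadic toy distribution: Huffman coding is exactly optimal here.
p = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
code = huffman_code(p)

H = entropy(p)                                     # = 1.75 bits
L = sum(p[s] * len(code[s]) for s in p)            # expected codeword length = 1.75
K = sum(2 ** -len(w) for w in code.values())       # Kraft sum, <= 1 for prefix codes
```

For a non-dyadic distribution the same code satisfies the bound H(X) <= L < H(X) + 1 from Lecture 4, with equality on the left only in the dyadic case.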
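The channel-coding part (Lectures 7-9) centers on the capacity of a discrete memoryless channel. As a hedged sketch, not taken from the lecture notes, the textbook formula for the binary symmetric channel with flip probability p, C = 1 - h(p) with h the binary entropy, can be evaluated directly:

```python
from math import log2

def h2(p):
    """Binary entropy h(p) = -p log2 p - (1-p) log2 (1-p), in bits."""
    if p in (0.0, 1.0):
        return 0.0  # limit value: 0 log 0 := 0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p):
    """Capacity of the binary symmetric channel with flip probability p."""
    return 1.0 - h2(p)

c_clean = bsc_capacity(0.0)   # noiseless channel: 1 bit per use
c_noisy = bsc_capacity(0.5)   # pure noise: capacity 0
c_mid = bsc_capacity(0.11)    # somewhere in between
```

Shannon's noisy channel coding theorem (Lecture 8) says rates below this value are achievable with vanishing error probability, and rates above it are not.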

### Literature

- Thomas M. Cover and Joy A. Thomas, "Elements of Information Theory", Wiley-Interscience, 2nd edition (2006).
- David MacKay, "Information Theory, Inference, and Learning Algorithms", Cambridge University Press, 1st edition (2003), available for free here.
- Claude E. Shannon, "A Mathematical Theory of Communication", Bell System Tech. J. 27 (1948), 379-423, 623-656. PDF file.