
Entropy and Information Theory [MA6001]

Summer semester 2015

Lecturers: Prof. Dr. Robert König, Prof. Dr. Felix Krahmer
Teaching assistant: Anna-Lena Hashagen
Seminar: Thursday 14.15-15.45, room 03.10.011

Advanced Seminar: Entropy and Information Theory

Robert König robert.koenig@tum.de
Felix Krahmer felix.krahmer@tum.de
Anna-Lena Hashagen hashagen@ma.tum.de
Time & Location: Thursdays, 14.15-15.45, Seminarraum M5 (5610.03.011)
  first meeting: April 16, 2015
  (weekly, except May 14, May 21, June 4, July 16)

Guidelines

We recommend having a look at Prof. Dr. Manfred Lehn's website, which gives advice on how to prepare a good seminar talk.

Topics

This seminar will serve as an introduction to the basic concepts of information theory. Topics to be discussed include information-theoretic quantities (entropies) and the corresponding inequalities, differential entropy for continuous random variables, as well as operational problems such as Shannon's noisy channel coding theorem, data compression, and rate distortion theory.
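
For orientation, the central quantities behind these topics are, in the standard notation of Cover and Thomas [2]: for discrete random variables X, Y with distributions p, q,

    H(X) = -\sum_x p(x) \log p(x)                         (entropy)
    D(p \| q) = \sum_x p(x) \log \frac{p(x)}{q(x)}        (relative entropy)
    I(X;Y) = H(X) + H(Y) - H(X,Y)                         (mutual information)
    C = \max_{p_X} I(X;Y)                                 (capacity of a memoryless channel p_{Y|X})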

You will be assigned one of the following topics. For each topic in the following list, we give a number of keywords as suggested concepts to be covered in your talk. However, it is up to you to make a reasonable selection of subtopics and choose which concepts to emphasize. Please try to include at least one full proof of a result.

  1. Review of basic probability theory (April 16): Discrete and continuous random variables, probability density, conditional distributions, Bayes rule, independence, expectation value and variance, Markov's inequality/Chebyshev's inequality, weak law of large numbers and tail bounds (Chernoff) for i.i.d. random variables. Definition of Markov chains. Leonardo Araujo

  2. Entropy, relative entropy and mutual information (April 23): basic definitions and properties. Venn-diagrammatic interpretation, chain rule, data processing inequality. Fano's inequality as an operational interpretation of conditional entropy. Sophia Koch

  3. Axiomatic approaches to entropies: Rényi entropy, Hartley entropy and Shannon entropy. Axioms and proof of uniqueness of Shannon entropy (see [1]). Classical statistical mechanics and Boltzmann/Gibbs entropy.

  4. Stochastic processes (April 30): in particular, Markov chains and random walks. Stationary distributions and convergence to stationarity. Entropy rates: different formulations. Entropy rates for Markov chains and hidden Markov chains. Anna Heinrich

  5. Asymptotic equipartition property (AEP) (May 7): Lossy data compression and Fano's inequality. AEP, typical sets, implications for data compression. Shannon's source coding theorem (a numerical illustration is sketched after this list). Niklas Schmidt

  6. Data compression (May 28): Lossless compression: Kraft's inequality, bounds on optimal codes, Huffman codes (see the Huffman sketch after this list). Nathanael Bosch

  7. Channel coding for discrete memoryless channels (June 11): examples of noisy channels (binary symmetric and binary erasure channel). Definition of rates and Shannon's capacity formula. Statement of the channel coding theorem and its (weak) converse. Melina Woitun

  8. Achievability of capacity for discrete memoryless channels (June 18): joint AEP, proof of achievability. Achievability by random linear codes for binary symmetric channel. Sebastian Scharl

  9. Concrete codes for communication (June 25): rate & code distance, linear codes, generator and parity-check matrices, Hamming bound, Reed-Solomon codes. Adrian Sieler

  10. Additive white Gaussian noise channel: differential entropy, AEP for continuous variables, achievability and converse.

  11. Types and universal source coding: Strong typicality, existence of a universal source code.

  12. Rate distortion theory: Problem statement, definition of rate distortion function, application to Bernoulli source, proof of converse.

  13. Introduction to quantum information theory (July 2): Jonathan Haas
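
As an illustrative supplement to topic 5, the following minimal Python sketch (using an arbitrary Bernoulli(0.3) source, not a parameter from the seminar materials) draws i.i.d. sequences and shows that the normalized log-likelihood -(1/n) log2 p(X_1,...,X_n) concentrates around the entropy H(X) as n grows, which is the content of the AEP.

    import math
    import random

    def bernoulli_entropy(p):
        """Binary entropy H(p) in bits."""
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    def empirical_rate(p, n, rng):
        """Sample an i.i.d. Bernoulli(p) sequence of length n and
        return -(1/n) log2 of its probability."""
        ones = sum(1 for _ in range(n) if rng.random() < p)
        log_prob = ones * math.log2(p) + (n - ones) * math.log2(1 - p)
        return -log_prob / n

    p = 0.3                     # illustrative source parameter (assumption)
    rng = random.Random(0)      # fixed seed for reproducibility
    print(f"H(X) = {bernoulli_entropy(p):.4f} bits")
    for n in (10, 100, 1000, 10000):
        rates = [empirical_rate(p, n, rng) for _ in range(5)]
        print(n, " ".join(f"{r:.4f}" for r in rates))

As n grows, the printed values approach H(X) ≈ 0.881 bits; by the source coding theorem, roughly n·H(X) bits then suffice to describe a typical length-n sequence.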

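Similarly, as a supplement to topic 6, a minimal Huffman-coding sketch (Python, with an arbitrary example distribution) computes optimal prefix-code lengths, verifies Kraft's inequality \sum_i 2^{-l_i} \le 1, and compares the expected code length with the entropy.

    import heapq
    import math

    def huffman_code_lengths(probs):
        """Return {symbol: codeword length} for a binary Huffman code."""
        # heap entries: (subtree probability, unique tie-breaker, symbols in subtree)
        heap = [(p, i, [s]) for i, (s, p) in enumerate(probs.items())]
        heapq.heapify(heap)
        lengths = {s: 0 for s in probs}
        counter = len(heap)
        while len(heap) > 1:
            p1, _, syms1 = heapq.heappop(heap)
            p2, _, syms2 = heapq.heappop(heap)
            for s in syms1 + syms2:   # merging two subtrees adds one bit to every leaf below
                lengths[s] += 1
            heapq.heappush(heap, (p1 + p2, counter, syms1 + syms2))
            counter += 1
        return lengths

    probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}   # arbitrary example distribution
    lengths = huffman_code_lengths(probs)
    kraft = sum(2 ** -l for l in lengths.values())
    avg_len = sum(probs[s] * lengths[s] for s in probs)
    entropy = -sum(q * math.log2(q) for q in probs.values())
    print("code lengths:", lengths)
    print(f"Kraft sum = {kraft:.3f} (<= 1), E[length] = {avg_len:.3f} bits, H(X) = {entropy:.3f} bits")

For this dyadic example the expected length equals the entropy exactly; in general a Huffman code satisfies H(X) <= E[length] < H(X) + 1.
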
Bibliography

[1] J. Aczél, B. Forte and C. T. Ng, "Why the Shannon and Hartley Entropies Are 'Natural'", Advances in Applied Probability, vol. 6, no. 1 (March 1974), pp. 131-146. Available at https://cs.uwaterloo.ca/research/tr/1973/CS-73-19.pdf

[2] Thomas M. Cover and Joy A. Thomas, Elements of Information Theory, 2nd edition, Wiley-Interscience (2006).

[3] David MacKay, Information Theory, Inference, and Learning Algorithms, 1st edition, Cambridge University Press (2003). Available at http://www.inference.phy.cam.ac.uk/mackay/itila/book.html

[4] Claude E. Shannon, "A Mathematical Theory of Communication", Bell System Technical Journal 27 (1948), pp. 379-423, 623-656. Available at http://cm.bell-labs.com/cm/ms/what/shannonday/shannon1948.pdf