Entropie und Informationstheorie [MA6001]
Sommersemester 2015
Prof. Dr. Robert König, Prof. Dr. Felix Krahmer
Lecturers: Prof. Dr. Robert König (robert.koenig@tum.de), Prof. Dr. Felix Krahmer (felix.krahmer@tum.de)
Teaching assistant: Anna-Lena Hashagen (hashagen@ma.tum.de)
Time & Location: Thursdays, 14.15-15.45, Seminarraum M5 (5610.03.011)
First meeting: April 16, 2015 (weekly, except May 14, May 21, June 4, July 16)
Guidelines
- Your presentation should ideally last at most 60 minutes. It may be given in English or German.
- Presentations should be given on the blackboard, but you may use a projector to show, e.g., pictures or graphs. The target audience is the other students attending the seminar.
- Please submit a summary (at most 2 pages) of the assigned topic at least one week before your talk. This should contain definitions, examples and references. These summaries will be put on the webpage.
- You can arrange a meeting with Anna-Lena Hashagen. This is intended to help identify the key concepts to be presented, address specific technical questions, and make sure your presentation is ready for an audience.
Topics
This seminar will serve as an introduction to basic concepts of information theory. Topics to be discussed include information-theoretic quantities (entropies) and corresponding inequalities, differential entropies and continuous variables, as well as operational problems such as Shannon's noisy channel coding, data compression and rate distortion theory.
You will be assigned one of the following topics. For each topic in the following list, we give a number of keywords as suggested concepts to be covered in your talk. However, it is up to you to make a reasonable selection of subtopics and choose which concepts to emphasize. Please try to include at least one full proof of a result.
- Review of basic probability theory (April 16):
Discrete and continuous random variables, probability density, conditional distributions, Bayes rule, independence, expectation value and variance, Markov's inequality/Chebyshev's inequality, weak law of large numbers and tail bounds (Chernoff) for i.i.d. random variables. Definition of Markov chains. Leonardo Araujo
- Entropy, relative entropy and mutual information (April 23): basic definitions and properties. Venn-diagrammatic interpretation, chain rule, data processing inequality. Fano's inequality as an operational interpretation of conditional entropy. Sophia Koch
- Axiomatic approaches to entropies: Rényi entropy, Hartley entropy and Shannon entropy. Axioms and proof of uniqueness of Shannon entropy (see [1]). Classical statistical mechanics and Boltzmann/Gibbs entropy.
- Stochastic processes (April 30): (in particular, Markov chains and random walks). Stationary distributions and convergence to stationarity. Entropy rates: different formulations. Entropy rates for Markov chains and hidden Markov chains. Anna Heinrich
- Asymptotic equipartition property (AEP) (May 7): Lossy data compression and Fano's inequality. AEP, typical sets, implications for data compression. Shannon's source coding theorem. Niklas Schmidt
- Data compression (May 28): Lossless compression: Kraft's inequality, bounds on optimal codes, Huffman codes. Nathanael Bosch
- Channel coding for discrete memoryless channels (June 11): examples of noisy channels (binary symmetric and binary erasure channel). Definition of rates and capacity formula (Shannon). Statement and (weak) converse to channel coding theorem. Melina Woitun
- Achievability of capacity for discrete memoryless channels (June 18): joint AEP, proof of achievability. Achievability by random linear codes for binary symmetric channel. Sebastian Scharl
- Concrete codes for communication (June 25): rate & code distance, linear codes, generator and parity-check matrices, Hamming bound, Reed-Solomon codes. Adrian Sieler
- Additive white Gaussian noise channel: differential entropy, AEP for continuous variables, achievability and converse.
- Types and universal source coding:
Strong typicality, existence of a universal source code.
- Rate distortion theory: Problem statement, definition of rate distortion function, application to Bernoulli source, proof of converse.
- Introduction to quantum information theory (July 2): Jonathan Haas
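Many of the quantities recurring in these topics are easy to experiment with numerically before the talks. The following Python sketch is our own illustration, not part of the seminar material (all function names are ours); it computes the Shannon entropy, the capacity of the binary symmetric channel, and the Kraft sum from the data compression topic.

```python
import math

def entropy(p):
    """Shannon entropy H(p) in bits of a discrete distribution p."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def binary_entropy(p):
    """Binary entropy h(p), i.e., H of the distribution (p, 1-p)."""
    return entropy((p, 1.0 - p))

def bsc_capacity(eps):
    """Shannon capacity C = 1 - h(eps) of the binary symmetric
    channel with crossover probability eps."""
    return 1.0 - binary_entropy(eps)

def kraft_sum(lengths):
    """Kraft sum sum_i 2^(-l_i); a binary prefix code with these
    codeword lengths exists iff the sum is at most 1."""
    return sum(2.0 ** -l for l in lengths)

# A fair coin carries 1 bit; a uniform 4-symbol source carries 2 bits.
print(entropy((0.5, 0.5)), entropy((0.25, 0.25, 0.25, 0.25)))
# A noiseless BSC has capacity 1; eps = 0.5 destroys all information.
print(bsc_capacity(0.0), bsc_capacity(0.5))
# Codeword lengths (1, 2, 3, 3) saturate Kraft's inequality.
print(kraft_sum((1, 2, 3, 3)))
```

Such small checks can, for instance, confirm the boundary cases of Shannon's capacity formula before you prove them on the blackboard.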
Bibliography
- [1] J. Aczél, B. Forte and C. T. Ng, "Why the Shannon and Hartley Entropies Are 'Natural'," Advances in Applied Probability, vol. 6, no. 1 (March 1974), pp. 131-146. Available at https://cs.uwaterloo.ca/research/tr/1973/CS-73-19.pdf
- [2] Thomas M. Cover and Joy A. Thomas, Elements of Information Theory, 2nd edition, Wiley-Interscience (2006).
- [3] David MacKay, Information Theory, Inference, and Learning Algorithms, first edition, Cambridge University Press (2003). Available at http://www.inference.phy.cam.ac.uk/mackay/itila/book.html
- [4] Claude E. Shannon, "A Mathematical Theory of Communication," Bell System Tech. J. 27 (1948), pp. 379-423, 623-656. Available at http://cm.bell-labs.com/cm/ms/what/shannonday/shannon1948.pdf