
Entropie und Informationstheorie [MA601185]

Wintersemester 2013/14


Lecturer: Prof. Dr. Michael M. Wolf
Time and location: Monday 16:15–17:45, room 03.10.011



In this course we learn the basics of classical information theory. The list of topics, with presenters and dates:
  1. Basic entropy and information inequalities, Ms Lambacher, 2 December.
  2. Stochastic processes, data processing and entropy rates, Ms Schleibner, 9 December.
  3. Data compression and Shannon's source coding theorem, Mr Koller, 16 December.
  4. Shannon's noisy channel coding theorem, Mr Bader, 13 January.
  5. Axiomatic approaches to entropic quantities, Mr Lindlacher, 20 January.
  6. Von Neumann entropy and quantum data compression, not assigned.
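As a small taste of topic 1, here is a minimal Python sketch (an illustration, not part of the course material) of the Shannon entropy H(X) = -Σ p(x) log₂ p(x) of a discrete distribution:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution.

    Terms with p = 0 are skipped, following the convention 0 * log 0 = 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly one bit of uncertainty...
print(shannon_entropy([0.5, 0.5]))   # 1.0
# ...while a biased coin carries less, which is what makes
# data compression (topic 3) possible.
print(shannon_entropy([0.9, 0.1]))
```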


  1. Each presentation should last 60–70 minutes.
  2. Presentations should be given on the blackboard; a projector may be used for specific purposes, for example showing pictures or graphs.
  3. The aim of the presentations is to teach the other students.
  4. A two-page summary of the assigned topic must be submitted, containing, for example, the main definitions and references.
  5. Each speaker should have at least two meetings with Dr Fukuda before the presentation: the first to decide what to cover, and the second to check that the speaker is ready for the talk.


  1. Thomas M. Cover and Joy A. Thomas, "Elements of Information Theory", Wiley-Interscience, 2nd edition (2006).
  2. David MacKay, "Information Theory, Inference, and Learning Algorithms", Cambridge University Press, 1st edition (2003), freely available online.
  3. Claude E. Shannon, "A Mathematical Theory of Communication", Bell System Tech. J. 27 (1948), 379–423, 623–656.