## Mathematical Foundations of Machine Learning [MA4801]

### Summer 2020

### News

- Lectures and exercise classes will take place online. Video lectures and lecture notes will be provided. Further material and information can be found on the Moodle page.

### Content

The course provides an introduction to the mathematical foundations of learning theory and neural networks. If time allows, we will also look into kernel methods.

| Date/video | Content |
| --- | --- |
| 20.04 | Unboxing MA4801 |
| 21.04 | Probabilistic framework for supervised learning |
| 27.04 | PAC bounds, growth function, VC-dimension |
| 04.05 | Computing the VC-dimension |
| 11.05 | Rademacher complexities |
| 18.05 | Algorithmic stability |
| 25.05 | Sample compression |
| 08.06 | Ensemble methods |
| 15.06 | Neural networks & their memorization capacity |
| 22.06 | Approximations via shallow networks |
| 30.06 | VC-dimension of neural networks |
| 07.07 | Deep neural networks |
| 14.07 | Training neural networks |
| 21.07 | Kernel methods |

### Lecture notes

Lecture notes will be updated once a week (usually on Tuesdays).

### Prerequisites

Basic knowledge of linear algebra, analysis, and probability theory is required. For the discussion of kernel methods, we will also need some elementary Hilbert space theory.

### Literature

There are many good books on the topic. Recent examples with a focus on mathematical aspects are:

- Foundations of Machine Learning, M. Mohri, A. Rostamizadeh, A. Talwalkar, MIT Press, 2012
- Understanding Machine Learning: From Theory to Algorithms, S. Shalev-Shwartz, S. Ben-David, Cambridge University Press, 2014

Among the classic books with a focus on mathematical results are:

- Neural Network Learning: Theoretical Foundations, M. Anthony, P.L. Bartlett, Cambridge University Press, 1999
- Statistical Learning Theory, V.N. Vapnik, John Wiley & Sons, 1998