Machine Learning II - Statistical ML (4V+2U, 8.0 LP)
| SWS | Type | Course Form | CP (Effort) | Presence Time | Self-Study |
|-------|------|-------------------------------------|-------------|---------------|------------|
| 4V+2U | K | Lecture with exercise classes (V/U) | 8.0 CP | 84 h | 156 h |
| 2 | U | Exercise class (in small groups) | | 28 h | |

- CP, Effort: 8.0 CP = 240 h
- Position in the semester: 1 Sem. in WiSe
- Level: Master (General)
- Area of study: [INF-KI] Intelligent Systems
- LaTeX slides and blackboard writing
- Active learning approaches in class
- Slides and blackboard pictures as download (PDF)
Details on the admission requirements for the final exam will be announced in the lecture.
Possible study achievement
- Verification of study performance: proof of successful participation in the exercise classes (ungraded)
- Details of the examination (type, duration, criteria) will be announced at the beginning of the course.
- Introduction and Overview
- Statistical machine learning
- Frequentist approach
- Bayesian approach
- Discriminative vs. generative models
- Graphical models and topic models
- Gaussian processes
- Deep generative models: VAEs and GANs
- Reinforcement Learning
- Selected advanced topics
- Christopher M. Bishop. Pattern Recognition and Machine Learning (Information Science and Statistics). Springer, New York, 2006 (corrected 2nd printing 2007).
- Richard Sutton and Andrew Barto. Reinforcement Learning: An Introduction. Second Edition. MIT Press, 2018.
- Trevor Hastie, Robert Tibshirani, and Jerome Friedman. The Elements of Statistical Learning. Springer Series in Statistics. Springer, 2001.
Requirements for attendance (informal)
Machine Learning II (short: ML2) builds directly on Machine Learning I (short: ML1). ML1 is therefore a mandatory prerequisite for ML2 for all CS students. Mathematics students who did not take ML1 can take 'Statistical Learning and Selected Applications' instead of (or in addition to) ML1 as a prerequisite. Note that 'Statistical Learning and Selected Applications', ML2, and in fact also ML3 can all be taken simultaneously.
ML2 requires the same mathematical machinery that was mandatory for ML1: linear algebra and calculus. In addition, a new mathematical formalism enters the picture: we will work with probabilities throughout. We avoided probabilities in ML1, but in ML2 we will rely on them heavily. We therefore require the following mathematical knowledge and skills:
1. Mandatory requirements:
- elementary probability theory (expected values, covariance matrices, PDFs, CDFs, common distributions, calculation rules, etc.)
Kaiserslautern CS BSc students covered this material in the undergraduate course 'Combinatorics, Probability and Statistics', but typically find it very challenging to apply that knowledge in the context of statistical machine learning; many students therefore consider Machine Learning 2 one of the most difficult courses in the CS master program. International CS students often lack the necessary preliminaries entirely. Both groups, but international students especially, are advised to refresh their knowledge of probability and the computation of integrals. An overview of much of the required mathematics is given in Goodfellow et al., Deep Learning, Chapter 3.
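As a quick self-check of the elementary probability toolkit listed above, here is a minimal NumPy sketch (purely illustrative, not part of the course material): it estimates an expected value and a covariance matrix from samples and evaluates the standard normal PDF by hand.

```python
import numpy as np

rng = np.random.default_rng(0)

# Draw samples from a 2-D Gaussian with known mean and covariance.
mean = np.array([1.0, -2.0])
cov = np.array([[2.0, 0.6],
                [0.6, 1.0]])
x = rng.multivariate_normal(mean, cov, size=100_000)

# Empirical expected value and covariance matrix; by the law of large
# numbers these should be close to the true parameters.
emp_mean = x.mean(axis=0)
emp_cov = np.cov(x, rowvar=False)

# PDF of the standard normal, written out explicitly.
def std_normal_pdf(t):
    return np.exp(-0.5 * t**2) / np.sqrt(2.0 * np.pi)

print(emp_mean)              # close to [1, -2]
print(emp_cov)               # close to cov
print(std_normal_pdf(0.0))   # 1/sqrt(2*pi) ≈ 0.3989
```

If manipulations like these feel unfamiliar, that is a sign the probability refresher above is worth doing before the course starts.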
2. Optional requirements (nice to have):
- more on continuous optimization (second-order methods, condition number, convergence rates, basics of stochastic optimization)
- common distributions (beta, gamma, Dirichlet, normal) and conjugate priors
- integration and measure theory (e.g., change of variables in multivariate integrals)
- inverting block matrices (Woodbury identity)
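The Woodbury identity mentioned in the list above can be verified numerically. A short sketch (illustrative only, assuming NumPy) checks that (A + UCV)^{-1} = A^{-1} - A^{-1} U (C^{-1} + V A^{-1} U)^{-1} V A^{-1} for a random low-rank update of a diagonal matrix:

```python
import numpy as np

rng = np.random.default_rng(1)

# Woodbury identity:
# (A + U C V)^{-1} = A^{-1} - A^{-1} U (C^{-1} + V A^{-1} U)^{-1} V A^{-1}
n, k = 5, 2
A = np.diag(rng.uniform(1.0, 2.0, size=n))  # easy-to-invert diagonal matrix
U = rng.standard_normal((n, k))
C = np.eye(k)
V = rng.standard_normal((k, n))

# Left-hand side: direct n-by-n inverse.
lhs = np.linalg.inv(A + U @ C @ V)

# Right-hand side: only a k-by-k inverse is needed besides A^{-1},
# which is why the identity matters when n is large and k is small
# (e.g. rank-one updates of Gaussian-process covariance matrices).
A_inv = np.diag(1.0 / np.diag(A))
inner = np.linalg.inv(np.linalg.inv(C) + V @ A_inv @ U)
rhs = A_inv - A_inv @ U @ inner @ V @ A_inv

print(np.allclose(lhs, rhs))  # True
```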
For more information on the mathematical preliminaries of the ML courses, see: