Introduction to Bayesian Learning
People
Course director
Description
Topics covered include the Bayesian learning paradigm: prior and posterior distributions, Bayesian point estimation, credible intervals, hypothesis testing, and linear regression. We then turn to Bayesian approaches in machine learning, with specific emphasis on Bayesian optimization, Bayesian deep learning, and Bayesian A/B testing.
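As a minimal sketch of the prior-to-posterior update, point estimation, and credible intervals listed above (a Beta-Binomial model with made-up data, chosen here purely for illustration and not necessarily the example used in class):

```python
import numpy as np
from scipy import stats

# Illustrative data: 7 successes out of 10 Bernoulli trials (made up for this sketch).
successes, trials = 7, 10

# A Beta(1, 1) prior (uniform) is conjugate to the Binomial likelihood,
# so the posterior is Beta(1 + successes, 1 + failures).
a_post = 1 + successes
b_post = 1 + (trials - successes)
posterior = stats.beta(a_post, b_post)

# Bayesian point estimates: posterior mean and posterior mode (MAP).
post_mean = posterior.mean()
post_map = (a_post - 1) / (a_post + b_post - 2)

# 95% equal-tailed credible interval from the posterior quantiles.
ci_low, ci_high = posterior.ppf([0.025, 0.975])

print(f"posterior mean = {post_mean:.3f}, MAP = {post_map:.3f}")
print(f"95% credible interval = ({ci_low:.3f}, {ci_high:.3f})")
```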
On the computational side, we will cover Monte Carlo integration, Markov chains, Markov chain Monte Carlo (MCMC) methods, adaptive MCMC, MCMC convergence diagnostics, and Approximate Bayesian Computation (ABC).
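To illustrate the MCMC and Monte Carlo integration topics, here is a minimal random-walk Metropolis sketch; the standard-normal target, step size, and chain length are illustrative assumptions only, not the course's own examples.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_target(theta):
    # Unnormalized log-density of the target; a standard normal is used
    # here only to keep the sketch self-contained.
    return -0.5 * theta**2

def random_walk_metropolis(n_iter=10_000, step=1.0, theta0=0.0):
    """Random-walk Metropolis: propose theta' ~ N(theta, step^2) and
    accept with probability min(1, pi(theta') / pi(theta))."""
    samples = np.empty(n_iter)
    theta, log_p = theta0, log_target(theta0)
    for i in range(n_iter):
        proposal = theta + step * rng.standard_normal()
        log_p_prop = log_target(proposal)
        if np.log(rng.uniform()) < log_p_prop - log_p:  # accept/reject step
            theta, log_p = proposal, log_p_prop
        samples[i] = theta
    return samples

draws = random_walk_metropolis()
# Monte Carlo integration: posterior expectations are approximated by sample means.
print("E[theta] ≈", draws.mean(), "  Var[theta] ≈", draws.var())
```

In practice one would also inspect trace plots, acceptance rates, and other convergence diagnostics before trusting such estimates; these are among the topics listed above.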
Prerequisites. Introduction to statistical inference: notions of population, sample, point estimator, confidence interval, hypothesis testing, and linear regression. Having taken the course “Introduction to Data Science”, or a similar course covering the basics of statistical inference, is very beneficial.
Objectives
Students will understand the main differences between the Bayesian and the frequentist approach to statistical learning.
They will be able to estimate a Bayesian model and to provide the corresponding uncertainty quantification. They will also learn how some machine learning algorithms can be embedded and/or interpreted in a Bayesian framework.
Teaching mode
In presence
Learning methods
Weekly lectures will be complemented with tutorials and practicals (with R / Python notebooks).
Examination information
A final project worth 100% of the grade. The project comprises a final report and a slide presentation (to be turned in together with a notebook and, when available, the data), which will be presented in front of the class; a Q&A session will follow.
Class participation will also be considered towards the final grade.
Education
- Master of Science in Computational Science, Lecture, 1st year
- Master of Science in Computational Science, Lecture, Elective, 2nd year
- PhD programme of the Faculty of Informatics, Lecture, Elective, 1st year (2.0 ECTS)