Introduction to Data Science
In inductive inference we are interested in learning about the state of the world given some event, i.e., the data. In this course we will learn about "estimation" procedures, in particular maximum likelihood and the method of moments, and some of their theoretical properties. We will also learn about hypothesis testing. We then apply both estimation and testing in a practical setting: linear regression analysis. The course will also be offered online, so that double-degree Master's students can enroll. After this course, students are able to:
- apply estimation procedures;
- derive properties of estimation procedures, such as bias, consistency, sufficiency, and efficiency;
- derive and apply maximum likelihood and method-of-moments estimators;
- prove the Cramér-Rao lower bound and the asymptotic efficiency of the maximum likelihood estimator;
- understand the Bayesian paradigm and apply Bayesian computational approaches;
- apply hypothesis testing and derive its properties, including the Neyman-Pearson theorem and the asymptotic distribution of the likelihood ratio statistic;
- apply estimation and testing principles to linear and logistic regression.
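As a taste of the estimation procedures listed above, here is a minimal sketch (not course material) of maximum likelihood and method-of-moments estimation for the rate of an exponential distribution; the data, sample size, and true rate are illustrative assumptions.

```python
import math
import random

random.seed(0)

# Simulated sample from an exponential distribution with (assumed) true rate 2.0.
lam_true = 2.0
data = [random.expovariate(lam_true) for _ in range(10_000)]

def log_likelihood(lam, xs):
    """Exponential log-likelihood: n*log(lam) - lam*sum(xs)."""
    return len(xs) * math.log(lam) - lam * sum(xs)

# Maximum likelihood: setting the score (derivative of the log-likelihood)
# to zero gives the closed form lam_hat = n / sum(x) = 1 / sample mean.
lam_mle = len(data) / sum(data)

# Method of moments: match the first moment E[X] = 1/lam to the sample mean.
# For this model it coincides with the MLE.
lam_mom = 1.0 / (sum(data) / len(data))

print(lam_mle, lam_mom)
```

Both estimators converge to the true rate as the sample grows (consistency), one of the properties studied in the course.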
Combination of lectures and tutorials
Final written exam
- Master of Science in Computational Science, Lecture, 1st year