Introduction to Data Science
At the end of the course, the student is able to:
- derive and apply maximum likelihood, method-of-moments, and Bayesian estimation;
- derive properties of estimation procedures, such as bias, consistency, sufficiency, and efficiency;
- prove the Cramér-Rao lower bound and the asymptotic efficiency of the MLE;
- apply Bayesian computational approaches;
- apply hypothesis testing and derive its properties;
- apply estimation and testing principles to linear regression;
- extend linear regression to logistic regression via IRWLS (iteratively reweighted least squares).
In inductive inference we want to learn about the state of the world from an observed event, i.e., the data. In this course we study estimation procedures, in particular maximum likelihood and the method of moments, together with some of their theoretical properties. We also study hypothesis testing. We then apply both estimation and testing in a practical setting: linear regression analysis.
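As a small taste of the contrast between the two estimation procedures named above, here is a minimal sketch (not course material; the Uniform(0, θ) model and the value θ = 10 are chosen purely for illustration) comparing the maximum likelihood estimator with the method-of-moments estimator for the upper bound of a uniform distribution:

```python
import random

random.seed(0)
theta = 10.0  # true parameter, assumed for this demo
sample = [random.uniform(0, theta) for _ in range(1000)]

# MLE: for Uniform(0, theta) the likelihood is maximised
# by the largest observation.
theta_mle = max(sample)

# Method of moments: E[X] = theta / 2, so equate the sample
# mean to theta / 2 and solve for theta.
theta_mom = 2 * sum(sample) / len(sample)

print(f"MLE:               {theta_mle:.3f}")
print(f"Method of moments: {theta_mom:.3f}")
```

Both estimators land close to the true value 10, but they behave differently: the MLE is always at most θ (it is biased downward), while the method-of-moments estimate can exceed θ. Properties like bias and efficiency, which distinguish such estimators, are exactly what the course develops formally.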
Weekly lectures will be complemented with tutorials and practicals.
A final exam, worth 70% of the final grade, is accompanied by three assignments throughout the course, each worth 10%.
- Statistical Inference, Casella and Berger, Duxbury, 2nd edition.