Algorithmic Information Theory
Algorithmic information theory uses optimal data compression to root the concept of information in computation rather than probability. Arising as the merger of Turing’s theory of computation and Shannon’s theory of information, algorithmic information involves many theoretical concepts, such as the Church-Turing thesis, prefix codes, Kolmogorov complexity, algorithmic randomness, Solomonoff induction and logical depth. It is also rich in applications, most notably by clearly expressing the incompleteness of mathematics and by detaching statistics and model selection from probabilities. Even more concrete are applications to machine learning, such as classification, risk analysis of generalized data and anomaly detection. All the aforementioned concepts shall be investigated to varying degrees of precision throughout the course.
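As a small illustration of the compression-based view sketched above (a sketch for intuition, not part of the course syllabus): Kolmogorov complexity is uncomputable, but the output length of any fixed real-world compressor, here Python’s standard `zlib`, gives a computable upper bound on it, separating regular strings from random-looking ones.

```python
import os
import zlib

def compressed_size(s: bytes) -> int:
    """Length of s after zlib compression (level 9).

    This upper-bounds the Kolmogorov complexity of s,
    up to an additive constant depending on the compressor.
    """
    return len(zlib.compress(s, 9))

repetitive = b"ab" * 500      # highly regular: short description exists
random_like = os.urandom(1000)  # incompressible with high probability

print(compressed_size(repetitive))   # far below 1000
print(compressed_size(random_like))  # close to (or above) 1000
```

A string is "algorithmically random" roughly when, as for `random_like` here, no compressor can describe it much more briefly than by writing it out in full.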
The key aims of the course are:
- to develop creative thought and technical abilities to solve problems;
- to convey an algorithm-based view of statistics and data analysis in which randomness is defined in terms of computation and
- to appreciate the philosophical consequences of such a strong theory of information, notably by modern expressions of mathematical incompleteness.
This course explores the theory and applications of algorithmic information via both lecture-style presentations (∼10 classes) and participatory seminars (∼4 classes).
Assessment: problem sheets to hand in (50%) and presentation of a research article (50%).