We treat the basic notions of discrete probability theory: probability spaces, the probability function, random variables, expectation, variance, and covariance. Of central importance are the limit theorems, such as the weak law of large numbers and the central limit theorem. These establish the link to statistics: how large must a random sample be in order to obtain a statistically significant statement? We discuss estimators and tests. Building on the notions of probability, we also discuss the basics of information theory, such as entropy, conditional entropy, and mutual information. We briefly look into the central results of coding theory, such as source and channel coding. These are of paramount importance, e.g., for data compression as well as (wireless) communication.
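Two of the notions above can be illustrated in a few lines of code. The following sketch (an informal illustration, not part of the course materials; the function names are our own) shows the weak law of large numbers empirically, as the sample mean of coin flips concentrating around 0.5, and computes the Shannon entropy of a finite distribution.

```python
import math
import random

def sample_mean(n):
    # Average of n fair coin flips (1 = heads, 0 = tails).
    # By the weak law of large numbers this concentrates around 0.5 as n grows.
    return sum(random.random() < 0.5 for _ in range(n)) / n

def entropy(p):
    # Shannon entropy (in bits) of a distribution given as a list of
    # probabilities; terms with probability 0 contribute nothing.
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

random.seed(0)
for n in (10, 1000, 100_000):
    print(n, sample_mean(n))   # fluctuations shrink like 1/sqrt(n)

print(entropy([0.5, 0.5]))     # fair coin: 1 bit
print(entropy([0.9, 0.1]))     # biased coin: less than 1 bit
```

The shrinking fluctuation of the sample mean is exactly the question raised above: the standard deviation of the mean of `n` flips is `0.5 / sqrt(n)`, which quantifies the sample size needed for a given precision.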