The course provides an introduction to the basics of machine learning from a probabilistic perspective. Its aim is to enable students to design inference and learning models and methods within a Bayesian framework. The course begins with a review of probability, mathematics, and optimisation, followed by the most common probabilistic models for discrete and continuous data, and then by models and methods for sequences. The main techniques of exact and approximate inference are then presented using a representation based on graphical models, including MCMC and variational inference methods. The course concludes with the application of these techniques to deep generative models.