Welcome to the new unit in Machine Learning. The unit will be given for the first time during the academic year 2017/18.

Machine Learning is the science of how we can build abstractions of the world from data. In this unit we will start with the fundamental underlying principles and philosophies that allow us to learn, and then look at how we can formulate these using explicit models. We will then look at how we can fit these models to data in order to "learn".

Machine learning is mathematical in nature, and a good grasp of linear algebra and multivariate calculus is recommended in order to fully digest the material. There is lots of good material on the internet that you can use to brush up your skills.

You can download all the material for the course by checking out the following GitHub repo. As the unit is running for the first time, some things will change during the term, so please keep up to date with the repo. I will try to add links to this page as much as possible, but these might go out of date quickly, so it is much better to follow the repository directly.


We have two lectures of 1h per week, Monday 15-16 and Tuesday 16-17, both in the Queens Building lecture theatre 1.40. Following the lecture on Tuesday we will have 1h of curated lecture where you tell me what I should talk about. The way this will work is that we have set up a reddit feed where you can post questions and suggestions. I will then, in classic reddit fashion, pick topics from the top of the feed. The feed is private at the moment, but we will accept anyone, so you can be completely anonymous while posting. This is a bit of an experiment that I hope will work, so I hope that you can help me out with this. Also, please be active on the feed and answer each other's questions.

  1. Introduction (w1)
  2. Basic Probabilities (w1)
  3. Distributions and further Probabilities (w2)
  4. Linear Regression (w2)
  5. Dual Linear Regression (w3)
  6. Gaussian Processes (w3)
  7. Gaussian Processes II and Unsupervised Learning (w4)
  8. Bayesian Optimisation (w4)
  9. Dirichlet Processes (w5)
  10. Topic Models (w5)
  11. Graphical Models (w6)
  12. Inference I: Laplace Approximation (w6)
  13. Inference II: Stochastic Methods (w7)
  14. Inference III: Variational Methods (w7)
  15. Neural Networks (w9)
  16. (TBC) (w9)
  17. Summary (w10)
  18. Dr. Tom Dieth, Amazon Research (w11)
  19. Dr. Tom Dieth, Amazon Research (w11)


Each week we have a one-hour lab session. During the labs you will work on the coursework (more on this later). You are free to use whatever programming language you want for the unit. But if you have an interest in continuing to work in machine learning, I strongly recommend that you get used to working with Python, as it is quickly becoming the standard language for ML. To help people kick their Matlab habit, we will have a crash course in Python during the first lab session.

The unit is assessed through two pieces of coursework and a written exam. The coursework should be done in groups of two to four, and both pieces are submitted as a report. The first assignment is worth 35% of the mark while the second is worth 15%. In addition to submitting a report, you should also submit your code. If you for one reason or another want to work alone on the coursework, come and talk to me first and we will see what we can do about this. I will mark the coursework independently of group size.


The first coursework will cover the material from week 1 to the beginning of week 4 and is mostly focused on how to reason about probabilities and how to build models of data. We will see the strength of priors and look at different ways of incorporating assumptions.
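As a small taste of what "the strength of priors" means in practice, here is a minimal sketch (not taken from the coursework itself; the model and numbers are illustrative) of conjugate Bayesian updating for a coin's bias: a Beta prior combined with Bernoulli observations gives a Beta posterior, and the prior's strength determines how much the data can move our beliefs.

```python
# Illustrative Beta-Bernoulli example (not from the coursework):
# a Beta(a, b) prior on a coin's bias, updated after observing
# heads/tails counts. The Beta prior is conjugate to the Bernoulli
# likelihood, so the posterior is again a Beta distribution.

def posterior_params(a, b, heads, tails):
    # Posterior is Beta(a + heads, b + tails).
    return a + heads, b + tails

def beta_mean(a, b):
    # Mean of a Beta(a, b) distribution.
    return a / (a + b)

data = dict(heads=9, tails=1)

# A flat prior Beta(1, 1) versus a strong prior Beta(50, 50)
# encoding a firm assumption that the coin is fair.
flat = posterior_params(1, 1, **data)      # Beta(10, 2)
strong = posterior_params(50, 50, **data)  # Beta(59, 51)

print(beta_mean(*flat))    # ≈ 0.833: the data dominate
print(beta_mean(*strong))  # ≈ 0.536: the prior dominates
```

The same ten observations lead to very different conclusions under the two priors, which is exactly the kind of reasoning about assumptions the coursework asks for.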

You can download the description of the coursework here. The deadline is the 27th of October; you will submit your report on SAFE.


The second coursework will focus on how we can perform approximate inference in models where it is intractable, computationally or analytically, to compute the marginal distribution in closed form. You can download the coursework here.

\[ p(y) = \int p(y, \mathbf{x}) \,\textrm{d}\mathbf{x} \]

This coursework will be released at the beginning of week 7, and the deadline is the 1st of December; you will submit your report on SAFE.
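To give a flavour of the stochastic methods covered in week 7, here is a minimal Monte Carlo sketch of the marginal above: draw samples from the prior over x and average the likelihood. The model is a toy chosen so that the marginal is known in closed form (it is illustrative, not from the coursework itself).

```python
import numpy as np

# Toy model where the marginal p(y) = ∫ p(y|x) p(x) dx is known in
# closed form, so we can check the Monte Carlo estimate:
#   x ~ N(0, 1),  y | x ~ N(x, 1)  =>  p(y) = N(y; 0, 2)

def normal_pdf(y, mean, var):
    # Density of a univariate Gaussian N(y; mean, var).
    return np.exp(-(y - mean) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

rng = np.random.default_rng(0)

def mc_marginal(y, n_samples=100_000):
    # Simple Monte Carlo estimate of the marginal:
    # (1/N) * sum_i p(y | x_i) with x_i drawn from the prior p(x).
    x = rng.standard_normal(n_samples)
    return normal_pdf(y, mean=x, var=1.0).mean()

estimate = mc_marginal(0.0)
exact = normal_pdf(0.0, mean=0.0, var=2.0)  # ≈ 0.2821
print(estimate, exact)
```

In the coursework the integral will not have a closed form, which is precisely when stochastic approximations like this become the tool of choice.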


We will use Bishop, C. M., Pattern Recognition and Machine Learning (2006) for most parts of this unit. The book is an excellent introduction to machine learning and one of the few that provides a consistent and rigorous narrative rather than falling into the trap of becoming a cookbook for practitioners. As the book is rather old, for a few of the topics we will use other material that is freely available.

Reading List

Lecture  Bishop                          Other Material
2        1.1, 1.2.1-4, 1.3, 1.4
3        2-2.3.5, 2.3.9, (2.4)
4        3.1, 3.3-3.6
5        6-6.2
6        6.4-6.4.2
7        6.4.3-6.4.4, 12.2-12.2.1
11       8.1-3
12       4.3-4.4
13       11.1.0-2, 11.1.4, 11.2.0, 11.3
14       10.1, 10.3
16       5-5.3