MLPR 2016 | Notes | Log | Forum | Tutorials | Assignments | Feedback
MLPR tutorials, Autumn 2016
This is an archive of a previous version of the course. The 2017/18 tutorial page is here.
For those taking the course for credit, the tutorials are compulsory. You meet with a tutor in a group to discuss the week's exercises, and/or anything else the group decides to cover. You should do the exercise sheet before your tutorial! If you have difficulties, that's fine: come to the tutorial able to explain where you got stuck. Even better, discuss with others in the class first, so you can explain what a few of you think. Your tutorial group won't necessarily discuss everything on the sheet. Full answers will be made available, so the point isn't to rush through all of them.
The tutorial sheets will be made available here.
- Tutorial 1, week 3, html, pdf.
- Tutorial 2, week 4, html, pdf.
- Tutorial 3, week 5, html, pdf.
- Tutorial 4, week 6, html, pdf.
- Tutorial 5, week 7, html, pdf.
- Tutorial 6, week 8, html, pdf.
- Tutorial 7, week 9, html, pdf.
Last year's sheets were different, as the course has changed in content, credit weighting, and order.
Informatics Forum, 10 Crichton Street, Edinburgh, EH8 9AB, Scotland, UK
MLPR 2017 | Notes | Log | Forum | Tutorials | Assignments | Feedback
MLPR class notes
This set of notes was new last year, and I am still actively trying to improve it. I will respond to your comments and questions, and fix or expand parts where necessary. However, effort from you is also required. Please sign up to the forum, and ask questions.
You can step through the HTML version of these notes using the left and right arrow keys.
Each note links to a PDF version for better printing. However, if possible, please annotate the HTML versions of the notes in the forum, to keep the class's comments together. If the HTML notes don't render well for you, I suggest trying in Chrome/Chromium. If you want quick access to the PDFs from this page, you can toggle the pdf links.
A rough indication of the schedule is given, although we won’t follow it exactly.
- w0a – Course administration, html, pdf.
- w0b – Books useful for MLPR, html, pdf.
- w0c – MLPR background self-test, html, pdf. Answers: html, pdf.
- w0d – Maths background for MLPR, html, pdf.
- w0e – Programming in Matlab/Octave or Python, html, pdf.
- w0f – Expectations and sums of variables, html, pdf.
- w1a – Course Introduction, html, pdf.
- w1b – Linear regression, html, pdf.
- w1c – Linear regression, overfitting, and regularization, html, pdf.
- w2a – Training, Testing, and Evaluating Different Models, html, pdf.
- w2b – Univariate Gaussians, html, pdf. Answers: html, pdf.
- w2c – The Central Limit Theorem (CLT), html, pdf. Answers: html, pdf.
- w2d – Error bars, html, pdf.
- w2e – Multivariate Gaussians, html, pdf.
- w3a – Classification: Regression, Gaussians, and pre-processing, html, pdf.
- w3b – Regression and Gradients, html, pdf.
- w3c – Logistic Regression, html, pdf.
- w4a – Softmax and robust regressions, html, pdf.
- w4b – Neural networks introduction, html, pdf.
- w4c – More on fitting neural networks, html, pdf.
- w6a – Autoencoders and Principal Components Analysis (PCA), html, pdf.
- w6b – Netflix Prize, html, pdf.
- w6c – Bayesian regression, html, pdf.
- w7a – Bayesian inference and prediction, html, pdf.
- w7b – Bayesian model choice, html, pdf.
- A Bayesian linear regression demo: matlab/octave, python
- w7c – Gaussian processes, html, pdf.
- A minimal GP demo: matlab/octave, python
- Alternative GP demo: matlab/octave, python
- w8a – Gaussian Processes and Kernels, html, pdf.
- w8b – Bayesian logistic regression and Laplace approximations, html, pdf.
- w8c – Computing logistic regression predictions, html, pdf.
- w10a – Sparsity and L1 regularization, html, pdf.
- w10b – More on optimization, html, pdf.
- w10c – Ensembles and model combination, html, pdf.
A coarse overview of the major topics covered is below. Some principles aren't listed as standalone topics because they recur in multiple contexts, such as gradient-based optimization, different regularization methods, ethics, and practical choices such as feature engineering and numerical implementation.
- Linear regression and ML introduction
- Evaluating and choosing methods from the zoo of possibilities
- Multivariate Gaussians
- Classification, generative and discriminative models
- Neural Networks
- Learning low-dimensional representations
- Bayesian machine learning: linear regression, Gaussian processes and kernels
- Approximate Inference: Bayesian logistic regression, Laplace, Variational
- Gaussian mixture models
- Time allowing, other principles: sparsity/L1; ensembles (combination vs. averaging)
You are encouraged to write your own outlines and summaries of the course. Aim to make connections between topics, and imagine trying to explain to someone else what the main concepts of the course are.