Machine Learning & Probabilistic Graphical Models - COM00032H

  • Department: Computer Science
  • Module co-ordinator: Dr. James Cussens
  • Credit value: 10 credits
  • Credit level: H
  • Academic year of delivery: 2018-19

Module summary

This module presents a number of machine learning (ML) methods, in which patterns are extracted from data in order to make predictions about future data and/or to understand the process that generated the data. It also presents probabilistic graphical models (PGMs), which use graphs to represent probabilistic relations between variables (for example, those representing observed data and those representing future data). The two topics are connected: many machine learning methods can be usefully represented as a PGM.
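
For example, naive Bayes can be read as a small PGM: a directed graph with a class variable pointing to each feature, encoding the assumption that the features are conditionally independent given the class. A minimal Python sketch of this reading, using made-up probability tables purely for illustration (they are not taken from the module materials):

    import numpy as np

    # Naive Bayes as a tiny Bayesian network: C -> X1, C -> X2.
    # The graph encodes that X1 and X2 are conditionally independent given C,
    # so the joint factorises as P(C, X1, X2) = P(C) P(X1 | C) P(X2 | C).

    p_c = np.array([0.6, 0.4])                    # P(C): prior over two classes
    p_x1_given_c = np.array([[0.9, 0.1],          # P(X1 | C): rows index C, columns index X1
                             [0.3, 0.7]])
    p_x2_given_c = np.array([[0.8, 0.2],          # P(X2 | C)
                             [0.4, 0.6]])

    def posterior(x1, x2):
        """P(C | X1=x1, X2=x2): multiply the factors and normalise."""
        unnormalised = p_c * p_x1_given_c[:, x1] * p_x2_given_c[:, x2]
        return unnormalised / unnormalised.sum()

    print(posterior(x1=0, x2=1))  # posterior over C after observing X1=0, X2=1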

Professional requirements

None

Module occurrences

Occurrence   Teaching cycle
A            Spring Term 2018-19 to Summer Term 2018-19

Module aims

This module aims to introduce a number of key machine learning methods and to connect them, where appropriate, to probabilistic graphical models. A key goal is to link learning from data with reasoning under uncertainty using probability theory.

Module learning outcomes

On completion of this module, students will:

  1. Understand how probabilistic graphical models represent relations of conditional independence between variables
  2. Understand the assumptions behind using the following methods for data analysis: PCA, LDA, Kernel methods, SVMs, naive Bayes
  3. Know when it is appropriate to use the following methods for data analysis: PCA, LDA, Kernel methods, SVMs, naive Bayes
  4. Know how to model complex machine learning problems using the BUGS software
  5. Understand the EM algorithm and its efficient implementation for a hidden Markov model (see the sketch after this list)
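
On outcome 5: the efficient E-step for a hidden Markov model rests on the forward (and backward) recursions, which avoid summing over every possible hidden-state path. The following Python sketch shows just the forward recursion, with made-up HMM parameters chosen only for illustration:

    import numpy as np

    # Forward algorithm for a discrete HMM: computes P(observations) in
    # O(T * K^2) time instead of enumerating all K^T hidden-state paths.
    # This recursion is the core of the E-step when EM (Baum-Welch) is run
    # on an HMM. All numbers below are illustrative, not module data.

    pi = np.array([0.5, 0.5])          # initial distribution over K=2 hidden states
    A = np.array([[0.7, 0.3],          # A[i, j] = P(next state j | current state i)
                  [0.2, 0.8]])
    B = np.array([[0.9, 0.1],          # B[i, v] = P(observe symbol v | state i)
                  [0.3, 0.7]])

    def forward(observations):
        """Return P(observations) under the HMM via the forward recursion."""
        alpha = pi * B[:, observations[0]]      # alpha_1(i) = pi_i * B[i, o_1]
        for obs in observations[1:]:
            # alpha_t(j) = sum_i alpha_{t-1}(i) * A[i, j] * B[j, o_t]
            alpha = (alpha @ A) * B[:, obs]
        return alpha.sum()

    print(forward([0, 1, 1, 0]))   # likelihood of a short observation sequence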

Module content

  • Bayesian networks
  • Naive Bayes
  • Markov networks
  • Markov chain Monte Carlo (MCMC) using BUGS
  • Hidden Markov models
  • The Expectation-Maximisation (EM) algorithm
  • Principal Components Analysis (PCA)
  • Linear Discriminant Analysis (LDA)
  • Kernels
  • Support Vector Machines

Assessment

Task                                        Length   % of module mark
Essay/coursework (Essay: Open Assessment)   N/A      100

Special assessment rules

None

Reassessment

Task                                        Length   % of module mark
Essay/coursework (Essay: Open Assessment)   N/A      100

Module feedback

  1. Students will complete a lightweight formative open assessment halfway through the module, for which they will receive feedback.
  2. Students will receive feedback on their work in the summative open assessment.

Indicative reading

  • Barber, D., Bayesian Reasoning and Machine Learning, Cambridge University Press, 2012
  • Hastie, T., Tibshirani, R., and Friedman, J., The Elements of Statistical Learning: Data Mining, Inference, and Prediction, Second Edition, Springer, 2009

The information on this page is indicative of the module that is currently on offer. The University is constantly exploring ways to enhance and improve its degree programmes and therefore reserves the right to make variations to the content and method of delivery of modules, and to discontinue modules, if such action is reasonably considered to be necessary by the University. Where appropriate, the University will notify and consult with affected students in advance about any changes that are required in line with the University's policy on the Approval of Modifications to Existing Taught Programmes of Study.