Statistics for Artificial Intelligence I - MAT00126M
- Department: Mathematics
- Credit value: 20 credits
- Credit level: M
- Academic year of delivery: 2025-26
Module summary
This module will introduce students to important ideas, topics and techniques in Artificial Intelligence, as well as their implementation.
Related modules
Post-requisite modules
Statistics for Artificial Intelligence II
Pre-requisite knowledge for MSc students:
- preliminary ideas of probability distributions;
- preliminary concepts of linear algebra (matrix manipulations).
Module will run
| Occurrence | Teaching period |
|---|---|
| A | Semester 1 2025-26 |
Module aims
This module will introduce students to important ideas, topics and techniques in Artificial Intelligence, as well as their implementation.
Module learning outcomes
By the end of the course, students will have a good grounding in the main statistical concepts underlying basic Neural Networks (NNs), Bayesian inference, and probabilistic techniques for mechanistic supervised learning. Specifically, after this module, students will be able to:
- undertake NN-based learning of simple functions that represent the relationship between an output and a set of inputs (in R);
- appreciate the foundation of NNs in generalised linear models, and undertake linear modelling and logistic modelling of real-world datasets (in R);
- undertake Rejection Sampling to generate random numbers from various probability distributions;
- undertake simple Markov Chain Monte Carlo (MCMC) based inference in R, to learn model parameters given real-world datasets;
- appreciate the effect of priors and proposals in Bayesian inference.
Module content
- Linear Models in Data Science - formulation of a linear model; target and input variables; closed-form estimation of regression coefficients; goodness of fit given the data; examples in R.
- Generalised Linear Models (GLMs) in Data Science - motivation and formulation of a GLM; link functions; linear predictor; logistic model; examples in R.
- Introduction to Neural Networks (NNs) - perceptron; architectural details; training and testing.
- Bayesian inference - Bayes' rule; posterior, likelihood, prior; what Bayesian inference is; Maximum a Posteriori (MAP) estimation and Markov Chain Monte Carlo (MCMC).
- Rejection Sampling - relevance in real-world learning; motivation and formal algorithm; implementation and development of code, with examples in R.
- MCMC - motivation and algorithm; development of code and R implementation of Random Walk Metropolis-Hastings (RWMH); Independence Sampler (IS) MCMC; learning uncertainties with MCMC.
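To give a flavour of the first topic, the closed-form estimation of regression coefficients can be sketched in a few lines of R. This is an illustrative example on simulated data, not course material; the variable names and values are assumptions.

```r
# Simulated data: one input x, target y with a known linear relationship
set.seed(1)
n <- 100
x <- runif(n)
y <- 2 + 3 * x + rnorm(n, sd = 0.1)

# Closed-form least-squares estimate: beta_hat = (X'X)^(-1) X'y
X <- cbind(1, x)                          # design matrix with intercept column
beta_hat <- solve(t(X) %*% X, t(X) %*% y)

# Cross-check against R's built-in linear model fit
fit <- lm(y ~ x)
all.equal(as.numeric(beta_hat), as.numeric(coef(fit)))
```

Solving the normal equations directly reproduces the coefficients that `lm` computes via a QR decomposition.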
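The logistic model under a logit link can be sketched in the same spirit; the data and true parameter values below are simulated for illustration.

```r
# Logistic regression as a GLM: the logit link maps probabilities in (0, 1)
# onto the linear predictor eta = b0 + b1 * x
set.seed(2)
n <- 500
x <- rnorm(n)
p <- 1 / (1 + exp(-(0.5 + 1.5 * x)))      # true success probabilities
y <- rbinom(n, size = 1, prob = p)

fit <- glm(y ~ x, family = binomial(link = "logit"))
coef(fit)                                 # estimates near the true values (0.5, 1.5)
```

Unlike the linear model, the GLM coefficients have no closed form and are found by iteratively reweighted least squares inside `glm`.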
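A perceptron with its mistake-driven weight update can also be sketched in base R. This toy example on linearly separable simulated data is an assumption about scope, not taken from the module.

```r
# Perceptron: learn a separating hyperplane by updating weights on mistakes
set.seed(3)
X <- matrix(rnorm(200), ncol = 2)
keep <- abs(X[, 1] + X[, 2]) > 0.5        # enforce a margin so the data are separable
X <- X[keep, ]
y <- ifelse(X[, 1] + X[, 2] > 0, 1, -1)   # labels from a known linear rule

w <- c(0, 0, 0)                           # bias plus two weights
for (epoch in 1:50) {
  for (i in seq_len(nrow(X))) {
    xi <- c(1, X[i, ])
    if (y[i] * sum(w * xi) <= 0) w <- w + y[i] * xi  # update only on misclassification
  }
}
train_acc <- mean(sign(cbind(1, X) %*% w) == y)
train_acc                                 # training accuracy on the separable data
```

Because the data are separable with a margin, the perceptron convergence theorem guarantees the updates eventually stop.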
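The formal Rejection Sampling algorithm can be sketched for a concrete case; the choice of a Beta(2, 5) target with a Uniform(0, 1) proposal is purely illustrative.

```r
# Rejection sampling: accept a proposal x ~ g whenever u < f(x) / (M * g(x)),
# where the envelope condition f(x) <= M * g(x) holds everywhere.
# Target f: Beta(2, 5); proposal g: Uniform(0, 1); bound M = 2.5
# (the Beta(2, 5) density peaks at about 2.46, at x = 0.2).
set.seed(4)
M <- 2.5
rejection_sample <- function(n) {
  out <- numeric(0)
  while (length(out) < n) {
    x <- runif(n)                         # proposals drawn from g
    u <- runif(n)                         # uniform acceptance variables
    out <- c(out, x[u < dbeta(x, 2, 5) / M])
  }
  out[1:n]
}
samples <- rejection_sample(10000)
mean(samples)                             # near E[Beta(2, 5)] = 2/7
```

A tighter bound M wastes fewer proposals; here roughly 1 in 2.5 draws is accepted.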
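Finally, RWMH can be sketched for the simplest case: learning the mean of a normal model under a flat prior, so the posterior mean should match the sample mean. The data, starting value, and step size of 0.5 are illustrative assumptions.

```r
# Random Walk Metropolis-Hastings: propose mu' = mu + N(0, 0.5^2) and accept
# with probability min(1, posterior(mu') / posterior(mu)), on the log scale.
set.seed(5)
y <- rnorm(50, mean = 4, sd = 1)
log_post <- function(mu) sum(dnorm(y, mean = mu, sd = 1, log = TRUE))  # flat prior

n_iter <- 5000
chain <- numeric(n_iter)
chain[1] <- 0                             # arbitrary starting value
for (t in 2:n_iter) {
  prop <- chain[t - 1] + rnorm(1, sd = 0.5)        # symmetric random-walk proposal
  if (log(runif(1)) < log_post(prop) - log_post(chain[t - 1])) {
    chain[t] <- prop                      # accept the move
  } else {
    chain[t] <- chain[t - 1]              # reject: chain stays put
  }
}
mean(chain[-(1:1000)])                    # posterior mean estimate, near mean(y)
```

Because the proposal is symmetric, the Hastings correction cancels and only the posterior ratio enters the acceptance probability; discarding the first 1000 iterations removes the burn-in from the arbitrary start.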
Indicative assessment
| Task | % of module mark |
|---|---|
| Closed/in-person Exam (Centrally scheduled) | 100.0 |
Special assessment rules
None
Indicative reassessment
| Task | % of module mark |
|---|---|
| Closed/in-person Exam (Centrally scheduled) | 100.0 |
Module feedback
Current Department policy on feedback is available in the student handbook. Coursework or examination will be marked and returned in accordance with this policy.
Indicative reading
- Chakrabarty, D. (2024). Supervised Learning - Mathematical Foundations & Real-World Applications. Taylor & Francis.
- Lantz, B. (2019). Machine Learning with R: Expert Techniques for Predictive Modelling. Packt.
- Young, D. (2018). Handbook of Regression Methods. CRC Press.
- James, G., Witten, D., Hastie, T., Tibshirani, R. (2018). An Introduction to Statistical Learning. Springer.