Bayesian Statistics (III/IV) (0530001) 10 credits

(Dr Peter Lee, (no office), tel 01904 654200, e-mail pml1@york.ac.uk)

This is the version for the academic year beginning on 1 September 2010.

Aims: To introduce the basic notions of Bayesian statistics, showing how Bayes' Theorem provides a natural way of combining prior information with experimental data to arrive at a posterior probability distribution over the parameters, and to show the differences between classical (sampling-theory) statistics and Bayesian statistics.
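The prior-to-posterior updating named in the aims can be illustrated with a small discrete example (the numbers below are hypothetical and chosen purely for illustration; they are not taken from the module):

```python
# Discrete Bayes' Theorem: P(H | D) = P(D | H) P(H) / P(D).
# Hypothetical diagnostic-test numbers, for illustration only.
prior = 0.01           # P(H): prior probability of the hypothesis
sensitivity = 0.95     # P(D | H): probability of the data given H
false_positive = 0.05  # P(D | not H)

# P(D) by the law of total probability
evidence = sensitivity * prior + false_positive * (1 - prior)
posterior = sensitivity * prior / evidence  # P(H | D)
print(round(posterior, 4))  # prints 0.161
```

Even a highly informative observation leaves the posterior modest here because the prior is small, which is exactly the interplay of prior information and data that the module examines.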

Learning objectives: By the end of the module the student should be able:

  1. To prove and use Bayes' Theorem in its various forms;
  2. To carry out an analysis of normally distributed data with a normal prior distribution;
  3. To carry out a significance test of a point (or sharp) null hypothesis using a full Bayesian methodology.
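The normal analysis in objective 2 can be sketched in a few lines, using the standard conjugate result that precisions add and the posterior mean is a precision-weighted average (the function name and interface below are illustrative assumptions, not part of the module):

```python
# Sketch of Bayesian inference for a normal mean with known data
# variance and a normal prior (objective 2). Illustrative only.
def normal_posterior(prior_mean, prior_var, data, data_var):
    """Return the posterior mean and variance for a normal mean."""
    n = len(data)
    xbar = sum(data) / n
    # Precisions (reciprocal variances) add.
    post_precision = 1 / prior_var + n / data_var
    post_var = 1 / post_precision
    # Posterior mean is a precision-weighted average of the
    # prior mean and the sample mean.
    post_mean = post_var * (prior_mean / prior_var + n * xbar / data_var)
    return post_mean, post_var

# With a N(0, 1) prior and three observations of variance 1,
# the posterior is N(1.5, 0.25): the data dominate the prior.
print(normal_posterior(0.0, 1.0, [1.0, 2.0, 3.0], 1.0))
```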

Prerequisites: Statistical Theory I (0590015).

Syllabus: The module will follow closely on parts of Chapters 1 to 4 of the main recommended textbook listed below.

  1. Preliminaries
    1. Probability and Bayes’ Theorem
    2. Examples on Bayes’ Theorem
    3. Random variables
    4. Several random variables
    5. Means and variances (Conditional Means and Variances omitted)
    6. Exercises on Chapter 1
  2. Bayesian Inference for the Normal Distribution
    1. Nature of Bayesian inference
    2. Normal prior and likelihood
    3. Several normal observations with a normal prior
    4. Dominant likelihoods
    5. Locally uniform priors
    6. Highest density regions (HDRs)
    7. Normal variance
    8. HDRs for the normal variance
    9. The role of sufficiency
    10. Conjugate prior distributions (Mixtures of conjugate densities omitted)
    11. The exponential family (Omitted)
    12. Normal mean and variance both unknown
    13. Conjugate joint prior for the normal distribution
    14. Exercises on Chapter 2
  3. Some Other Common Distributions
    1. The binomial distribution
    2. Reference prior for the binomial likelihood
    3. Jeffreys’ rule (Multidimensional case omitted)
    4. The Poisson distribution
    5. The uniform distribution (Omitted)
    6. Reference prior for the uniform distribution (Omitted)
    7. The tramcar problem (Omitted)
    8. The first digit problem; invariant priors (Omitted)
    9. The circular normal distribution (Omitted)
    10. Approximations based on the likelihood (Extension to more than one parameter omitted)
    11. Reference posterior distributions (Omitted)
    12. Exercises on Chapter 3
  4. Hypothesis testing
    1. Hypothesis testing
    2. One-sided hypothesis tests
    3. Lindley’s method
    4. Point null hypotheses with prior information (Case of nearly constant likelihood omitted)
    5. Point null hypotheses (normal case) (Case of an unknown variance omitted)
    6. The Doogian philosophy (Omitted)
    7. Exercises on Chapter 4
  5. Two sample problems (Omitted)
  6. Correlation, regression and analysis of variance (Omitted)
  7. Other topics (Omitted)
  8. Hierarchical models (Omitted)
  9. The Gibbs sampler and other numerical methods (Non-examinable)
    1. Introduction to Numerical Methods
    2. The EM Algorithm (Omitted apart from the subsection below)
      • Data augmentation
    3. Data augmentation by Monte Carlo
    4. The Gibbs sampler
    5. Rejection sampling (Omitted)
    6. The Metropolis-Hastings algorithm (Omitted)
    7. Introduction to WinBUGS (Omitted)
    8. Generalized linear models (Omitted)
    9. Exercises on Chapter 9
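Since the Gibbs sampler appears in the (non-examinable) Chapter 9 material, a minimal sketch may help fix ideas: for normally distributed data with both mean and variance unknown under a vague prior, one alternates draws from the two full conditional distributions. The code below is an illustrative assumption, not the module's own treatment:

```python
import random
import statistics

def gibbs_normal(data, iters=2000, seed=0):
    """Toy Gibbs sampler for a normal mean and variance under a
    vague prior; alternates draws from the two full conditionals.
    Illustrative sketch only."""
    rng = random.Random(seed)
    n = len(data)
    xbar = sum(data) / n
    mu, var = xbar, statistics.pvariance(data)  # starting values
    draws = []
    for _ in range(iters):
        # mu | var, data  ~  Normal(xbar, var / n)
        mu = rng.gauss(xbar, (var / n) ** 0.5)
        # var | mu, data  ~  scaled inverse chi-squared: sample as
        # sum of squared deviations divided by a chi-squared(n) draw
        ss = sum((x - mu) ** 2 for x in data)
        chi2 = sum(rng.gauss(0, 1) ** 2 for _ in range(n))
        var = ss / chi2
        draws.append((mu, var))
    return draws
```

After a burn-in period, the `(mu, var)` pairs behave as draws from the joint posterior, which is the point of the method: each conditional is easy to sample even when the joint distribution is not.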

Appendices

References Index

Recommended texts:
*** P M Lee, Bayesian Statistics: An Introduction, Wiley (SF 4 LEE)
(For most of the material covered, either the second, the third or the fourth edition will be satisfactory.)
*** A Gelman, J B Carlin, H S Stern and D B Rubin, Bayesian Data Analysis (2nd edn) Chapman & Hall (SF GEL)
*** H R Neave, Statistics Tables for mathematicians, engineers, economists and the behavioural and management sciences, London: Routledge (SF 0.83 NEA and REF SF 0.83 NEA).
*     G R Iverson, Bayesian Statistical Inference, Beverly Hills, CA: Sage (SF 4 IVE) (for preliminary reading).
*     S B McGrayne, The Theory That Would Not Die, New Haven, CT: Yale University Press, 2011 (for very interesting background reading) (SF 4 MCG).
*     J Albert, Bayesian Computation with R, Springer-Verlag (SF 4 ALB) (for computational aspects).
**   D V Lindley, An Introduction to Probability and Statistics from a Bayesian Viewpoint (2 vols - Part I: Probability and Part II: Inference), Cambridge University Press (S 9 LIN).

Teaching and Support: Spring Term
2 lectures per week
1 problems class per week

Assessment:
One-and-a-half-hour closed examination towards the end of the Summer Term: 90%
Coursework: 10%

Elective Information: A course on Bayesian statistics showing how to make inferences by combining prior beliefs with information obtained from experimental data.