Bayesian Statistics

An Introduction

Fourth Edition

PETER M. LEE

Preface to the Fourth Edition

When I started writing this book in 1987 it never occurred to me that it would still be of interest nearly a quarter of a century later, but it appears that it is, and I am delighted to introduce a fourth edition. The subject moves ever onwards, with increasing emphasis on Monte Carlo-based techniques. With this in mind, Chapter 9, entitled “The Gibbs sampler”, has been considerably extended (including more numerical examples and treatments of OpenBUGS, R2WinBUGS and R2OpenBUGS), and a new Chapter 10 covering Bayesian importance sampling, variational Bayes, ABC (Approximate Bayesian Computation) and RJMCMC (Reversible Jump Markov Chain Monte Carlo) has been added. Mistakes and misprints in the third edition have been corrected, and minor alterations have been made throughout.

The basic idea of using Bayesian methods has become more and more popular, and a useful accessible account for the layman has been written by McGrayne (2011). There is every reason to believe that an approach to statistics which I began teaching in 1985 with some misgivings because of its unfashionability will continue to gain adherents. The fact is that the Bayesian approach produces results in a comprehensible form and with modern computational methods produces them quickly and easily.

Useful comments, for which I am grateful, were received from John Burkett, Stephen Connor, Jacco Thijssen, Bo Wang and others; they, of course, have no responsibility for any deficiencies in the end result.

The website associated with the book

http://www.york.ac.uk/depts/maths/histstat/pml1/bayes/book.htm

(note that in the above pml are letters followed by the digit 1) works through all the numerical examples in R, as well as giving solutions to all the exercises in the book (and some further exercises to which the solutions are not given).

Peter M. Lee

19 December 2011