Theme: Bayesian Inference & Data Assimilation
Course Title: Introduction to Bayesian Computational Methods via Markov Chain Monte Carlo Algorithms

Lecturer: Dr Chris Drovandi

Course Content
Statistical inferences in a Bayesian framework are obtained through the posterior distribution, which quantifies the uncertainty in model parameters based on information from observed data and prior knowledge. To make these inferences it is often necessary to generate samples from the posterior distribution, but this can rarely be done exactly. One of the foundational approaches to approximate sampling from the posterior, and still among the most popular today, is the class of Markov chain Monte Carlo (MCMC) methods. The main idea of MCMC is to construct a Markov chain with the posterior as its limiting distribution. This course will describe the concepts underpinning the validity of MCMC samplers and popular MCMC algorithms for exploring the posterior. The course will also describe how to process MCMC output and how to diagnose convergence. The computer lab will involve developing an MCMC algorithm for a case study with a given model and dataset. The MCMC method will be implemented in MATLAB.
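
As a taste of the lab material, the sketch below shows a minimal random-walk Metropolis-Hastings sampler in MATLAB. The target is illustrative only: a normal model N(theta, 1) for hypothetical data y with a N(0, 10^2) prior on theta; the course's actual case study, model and dataset are not specified here.

    % Minimal random-walk Metropolis-Hastings sketch (illustrative example,
    % not the course's case study).
    rng(1);
    y = 2 + randn(20, 1);                          % hypothetical observed data
    log_post = @(theta) sum(-0.5*(y - theta).^2) - 0.5*(theta/10)^2;  % log posterior up to a constant

    n_iter = 10000;
    theta = zeros(n_iter, 1);                      % chain storage, started at 0
    step = 0.5;                                    % proposal standard deviation

    for t = 2:n_iter
        prop = theta(t-1) + step*randn;            % symmetric random-walk proposal
        if log(rand) < log_post(prop) - log_post(theta(t-1))
            theta(t) = prop;                       % accept the proposal
        else
            theta(t) = theta(t-1);                 % otherwise keep the current state
        end
    end

    burn_in = 1000;                                % discard early, pre-convergence draws
    fprintf('Posterior mean estimate: %.3f\n', mean(theta(burn_in+1:end)));

Because the random-walk proposal is symmetric, the acceptance ratio reduces to the ratio of posterior densities, evaluated here on the log scale for numerical stability.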

Background Reading

  • An undergraduate course in statistical inference (the concept of a likelihood function and parameter estimation)
  • An introductory knowledge of Bayesian statistics would also be desirable

Course Title: Data Assimilation: A Mathematical Introduction

Lecturer: Dr Kody Law

Course Content
These lectures will provide a systematic treatment of the mathematical underpinnings of work in data assimilation, covering both theoretical and computational approaches. Specifically, we will develop a unified mathematical framework in which a Bayesian formulation of the problem provides the bedrock for the derivation, development and analysis of algorithms. Explicit calculations, numerical examples, exercises and MATLAB code will be provided to illustrate the theory. The lectures will also include an introduction to some state-of-the-art algorithms.
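
To give a flavour of the Bayesian formulation, the MATLAB sketch below works through a single analysis (update) step for a scalar linear-Gaussian model, where the posterior is available in closed form via the Kalman filter formulae; the numerical values and observation operator are hypothetical, chosen purely for illustration.

    % Minimal sketch of one Bayesian analysis step for a scalar
    % linear-Gaussian model (Kalman filter update); all values hypothetical.
    m_f = 0.0;  C_f = 1.0;            % forecast (prior) mean and variance
    H = 1.0;    R = 0.25;             % observation operator and noise variance
    y = 1.2;                          % hypothetical observation

    K = C_f*H / (H*C_f*H + R);        % Kalman gain
    m_a = m_f + K*(y - H*m_f);        % analysis (posterior) mean
    C_a = (1 - K*H)*C_f;              % analysis (posterior) variance

    fprintf('Analysis mean %.3f, variance %.3f\n', m_a, C_a);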

Background Reading