Course Catalog 2011-2012

MAT-51706 Bayesian Methods, 6 cr

Additional information

Pre-recorded lectures in English are available on-line; contact teaching consists entirely of tutorial sessions.
Suitable for postgraduate studies

Person responsible

Robert Piché

Lessons

Study type: Exercises (2 h/week + 2 h/week)
Implementations: MAT-51706 2011-01
Lecture times and places: -

Requirements

Exam, or weekly exercises + exam.
Completion parts must belong to the same implementation.

Principles and baselines related to teaching and learning

-

Learning outcomes

Bayesian statistics is an alternative approach to classical statistics. It is a coherent data analysis methodology based on the systematic application of the laws of probability. Since the development of powerful computational algorithms beginning in the 1990s, Bayesian methods have become widely used in all areas of science and engineering, including machine learning, medical imaging, and data compression. After studying this course, the student is able to formulate statistical models for inference, hypothesis testing, model comparison, and decisions. Given a data set, he/she can apply formulas for the solution of simple standard models, and can write WinBUGS programs and MCMC algorithms for the solution of more realistic models. The prerequisite for the course is knowledge of elementary probability theory: probability calculus; pdfs and pmfs; mean and variance; change of variable; the normal, binomial, Poisson, and exponential distributions. Knowledge of "classical" statistics is not needed.
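As a small illustration of the kind of standard-model computation described above, the sketch below works out a conjugate beta-binomial analysis in Python. It is not part of the course material (the course itself uses WinBUGS); the Beta(1, 1) prior and the count data are assumed purely for illustration.

# Conjugate analysis of a binomial proportion with a Beta prior.
# Assumed example values; not taken from the course material.
from scipy import stats

a, b = 1.0, 1.0            # Beta(1, 1) (uniform) prior on the proportion theta
successes, trials = 7, 20  # hypothetical count data

# Conjugacy: the posterior is Beta(a + successes, b + failures).
posterior = stats.beta(a + successes, b + trials - successes)

print("posterior mean:  ", posterior.mean())
print("posterior median:", posterior.median())
print("95% credibility interval:", posterior.interval(0.95))
print("P(theta < 0.5 | data):   ", posterior.cdf(0.5))  # one-sided hypothesis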

Content

1. Core content: Formulation of standard parameter-inference problems: finding the mean of real data (normal model), the proportion of count data (binomial model, multinomial model), the rate of occurrences (Poisson model), or the lifetime (exponential model). Derivation of the exact solution (i.e. the parameters' posterior distribution and the data's posterior predictive distribution) using conjugate priors. Writing of WinBUGS and DoodleBUGS programs to compute the solution numerically. Computation of summary statistics (mean, mode, median, credibility intervals) and one-sided hypotheses.
   Complementary knowledge: Recursive use of Bayes' rule; prior predictive distribution; beta, gamma, inverse-gamma, Student-t, and Dirichlet distributions; Poisson model for occurrences in intervals of different sizes; coping with censored lifetime data; derivation of the mean and variance formulas for the beta and gamma distributions.
   Specialist knowledge: Proofs of probability calculus theorems from the axioms; eliciting subjective probability from betting odds; negative binomial and beta-binomial distributions; derivation of the Poisson density; hazard and reliability functions; derivation of the posterior marginal distribution for the two-parameter normal model.

2. Core content: Formulation of more complex inference problems: Gaussian mixture for coping with outliers; comparing means; hierarchical model for comparing groups; fitting a regression curve; fitting an autoregressive model; detecting a change in rate. Writing of WinBUGS programs and Gibbs sampler algorithms for their solution (a Gibbs sampler sketch follows this list).
   Complementary knowledge: Comparing means in models with paired observations; Markov chains and stationary distributions.
   Specialist knowledge: The joint density corresponding to a directed acyclic graph (DAG); proof that the Gibbs sampler's stationary distribution is the posterior.

3. Core content: Bayesian theory: Jeffreys' prior; stopping rule principle; Laplace's method for approximation of the posterior (a sketch follows this list); model comparison via the Bayes factor and via the Deviance Information Criterion (DIC).
   Complementary knowledge: Consistency of Jeffreys' prior with change of variables; the likelihood principle; applications of exact marginalisation: detecting a change point, finding the frequency of a periodic signal, finding the autoregression parameter, choosing the regularisation parameter.
   Specialist knowledge: Ancillary data, sufficient statistics.

4. Core content: Finding the decision that minimizes the expected loss and finding the Bayesian decision function for problems with a finite number of discrete variables and options (a decision-theory sketch follows this list).
   Complementary knowledge: Prior value of data; signal detection; decision-theoretical interpretation of the mean, mode, and median.
   Specialist knowledge: Derivation of the mean, mode, and median.
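As referenced in item 2 above, here is a minimal Gibbs sampler sketch in Python for the two-parameter normal model (unknown mean and precision). It is not course material (the course uses WinBUGS and hand-written samplers); the vague priors, the full conditionals stated in the comments, and the data values are assumptions made for illustration.

# Gibbs sampler for a normal model with unknown mean mu and precision tau,
# assuming a flat prior on mu and a Jeffreys-type prior p(tau) ~ 1/tau.
import numpy as np

rng = np.random.default_rng(0)
y = np.array([4.8, 5.1, 5.6, 4.9, 5.3, 5.0])   # hypothetical observations
n, ybar = len(y), y.mean()

n_iter = 5000
mu, tau = ybar, 1.0                  # starting values
samples = np.empty((n_iter, 2))

for i in range(n_iter):
    # Full conditional: mu | tau, y ~ Normal(ybar, 1/(n*tau))
    mu = rng.normal(ybar, 1.0 / np.sqrt(n * tau))
    # Full conditional: tau | mu, y ~ Gamma(n/2, rate = sum((y - mu)^2)/2)
    rate = 0.5 * np.sum((y - mu) ** 2)
    tau = rng.gamma(shape=0.5 * n, scale=1.0 / rate)
    samples[i] = mu, tau

kept = samples[1000:]                # discard burn-in draws
print("posterior mean of mu:   ", kept[:, 0].mean())
print("posterior mean of sigma:", (1.0 / np.sqrt(kept[:, 1])).mean())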
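For item 3, the sketch below illustrates Laplace's method on a binomial-proportion problem: the posterior is approximated by a normal distribution centred at the posterior mode, with variance given by the inverse curvature of the negative log-posterior. The Beta(1, 1) prior and the counts are assumed for illustration, and the exact conjugate answer is printed for comparison.

# Laplace approximation of the posterior of a binomial proportion theta,
# compared with the exact conjugate (beta) posterior. Assumed example data.
import numpy as np
from scipy import optimize, stats

successes, trials = 7, 20

def neg_log_post(theta):
    # Negative log posterior up to a constant; with a Beta(1, 1) prior this
    # is just the negative binomial log-likelihood.
    return -(successes * np.log(theta) + (trials - successes) * np.log(1.0 - theta))

res = optimize.minimize_scalar(neg_log_post, bounds=(1e-6, 1 - 1e-6), method="bounded")
mode = res.x
# Curvature (second derivative) of the negative log posterior at the mode:
h = successes / mode**2 + (trials - successes) / (1.0 - mode)**2

laplace = stats.norm(mode, 1.0 / np.sqrt(h))          # normal approximation
exact = stats.beta(1 + successes, 1 + trials - successes)
print("exact 95% interval:  ", exact.interval(0.95))
print("Laplace 95% interval:", laplace.interval(0.95))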
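For item 4, here is a tiny sketch of choosing the decision that minimizes the posterior expected loss for a problem with a finite number of states and options. The posterior probabilities and the loss table are made-up values, not from the course.

# Bayesian decision: pick the action with the smallest posterior expected loss.
import numpy as np

posterior = np.array([0.2, 0.5, 0.3])       # P(state | data), hypothetical values
loss = np.array([[ 0.0, 10.0, 20.0],        # loss[action, state]
                 [ 5.0,  0.0,  5.0],
                 [20.0, 10.0,  0.0]])

expected_loss = loss @ posterior            # expected loss of each action
best_action = int(np.argmin(expected_loss))
print("expected losses:", expected_loss)
print("Bayes decision: action", best_action)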

Evaluation criteria for the course

The course grade is based on a three-hour open-book exam written in a PC lab that allows access to WinBUGS. Bonus points (up to 20%) can be earned by presenting solutions to the weekly homework problems.

Assessment scale:

A numerical evaluation scale (1-5) will be used in the course.

Partial passing:

Completion parts must belong to the same implementation.

Study material

Type: Other online content
Name: Home page
Author: Robert Piché
Edition, availability, ...: Recorded lectures (flash), exercises (pdf)
Language: English

Type: Online book
Name: Bayesian Methods (pdf)
Author: Robert Piché and Antti Penttinen
Language: English


Correspondence of content

MAT-51706 Bayesian Methods, 6 cr corresponds to MAT-51700 Bayesian Analysis, 6 cr.

More precise information per implementation

Implementation: MAT-51706 2011-01
Description: Recorded lectures are available for self-study on the course home page. In the weekly tutorials, students present their solutions to exercise problems.
Methods of instruction: -

Last modified 09.03.2011