Syllabus
Unit I
Basics of minimaxity: subjective and frequentist probability, Bayesian inference, Bayesian estimation, prior distributions, posterior distribution, loss function, principle of minimum expected posterior loss, quadratic and other common loss functions; advantages of being a Bayesian; HPD confidence intervals, testing, credible intervals, prediction of a future observation.
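A minimal sketch of the conjugate updating and the quadratic-loss estimator covered in this unit (Python is used here for illustration; the hyperparameters and data are hypothetical):

```python
# Beta-Binomial conjugate updating (illustrative numbers).
# Prior Beta(a, b); after s successes in n Bernoulli trials the
# posterior is Beta(a + s, b + n - s).  Under quadratic loss the
# Bayes estimator is the posterior mean.

a, b = 2.0, 2.0        # prior hyperparameters (assumed for illustration)
n, s = 10, 7           # observed data: 7 successes in 10 trials

post_a, post_b = a + s, b + (n - s)      # posterior Beta parameters
post_mean = post_a / (post_a + post_b)   # Bayes estimate under quadratic loss

print(post_a, post_b)        # 9.0 5.0
print(round(post_mean, 4))   # 0.6429
```

An HPD or equal-tailed credible interval for the same posterior can be read off the Beta(9, 5) quantiles.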
Unit II
Bayesian analysis with subjective priors, robustness and sensitivity; classes of priors: conjugate class, neighborhood class, density ratio class; different methods of constructing objective priors: Jeffreys prior, probability-matching priors, conjugate priors and mixtures; posterior robustness: measures and techniques.
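A small check of the Jeffreys-prior construction listed above, for the Bernoulli model (a standard textbook case, sketched here as an assumption-free identity): the Fisher information is I(theta) = 1/(theta(1 - theta)), so the Jeffreys prior, proportional to sqrt(I(theta)), has the kernel of a Beta(1/2, 1/2) distribution.

```python
import math

# Jeffreys prior for the Bernoulli model:
# sqrt(Fisher information) = theta^(-1/2) * (1 - theta)^(-1/2),
# which is the (unnormalized) Beta(1/2, 1/2) density.

def jeffreys_kernel(theta):
    return math.sqrt(1.0 / (theta * (1.0 - theta)))

def beta_half_kernel(theta):
    return theta ** -0.5 * (1.0 - theta) ** -0.5

for t in (0.1, 0.5, 0.9):
    assert abs(jeffreys_kernel(t) - beta_half_kernel(t)) < 1e-12
print("Jeffreys prior matches the Beta(1/2, 1/2) kernel")
```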
Unit III
Model selection and hypothesis testing based on objective probabilities and Bayes factors; large-sample methods: limit of the posterior distribution, consistency of the posterior distribution, asymptotic normality of the posterior distribution.
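A toy Bayes-factor computation for the testing material in this unit (the data and hypotheses are hypothetical): comparing a point null H0: theta = 1/2 against H1: theta ~ Uniform(0, 1) for binomial data, where the marginal likelihood under H1 reduces to 1/(n + 1).

```python
from math import comb, factorial

# Bayes factor BF01 for s successes in n Bernoulli trials
# (illustrative numbers).
n, s = 10, 7

# Marginal likelihood under H0: theta = 0.5.
m0 = comb(n, s) * 0.5 ** n

# Under H1: theta ~ Uniform(0, 1), integrating the binomial likelihood
# gives comb(n, s) * B(s + 1, n - s + 1) = 1 / (n + 1).
m1 = comb(n, s) * factorial(s) * factorial(n - s) / factorial(n + 1)

bf01 = m0 / m1
print(round(bf01, 4))   # 1.2891 -> mild evidence, favoring neither model strongly
```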
Unit IV
Bayesian computations: analytic approximations, the E-M algorithm, Monte Carlo sampling, Markov chain Monte Carlo methods, the Metropolis-Hastings algorithm, Gibbs sampling, examples, convergence issues.
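The Metropolis-Hastings algorithm listed above can be sketched in a few lines; this toy random-walk sampler targets a standard normal (the step size, chain length, and seed are illustrative choices, not prescriptions):

```python
import math
import random

# Random-walk Metropolis-Hastings targeting N(0, 1) (toy example).
random.seed(0)

def log_target(x):
    return -0.5 * x * x  # log density of N(0, 1), up to a constant

x = 0.0
samples = []
for _ in range(50_000):
    proposal = x + random.gauss(0.0, 1.0)          # symmetric proposal
    log_accept = log_target(proposal) - log_target(x)
    if random.random() < math.exp(min(0.0, log_accept)):
        x = proposal                               # accept; else stay put
    samples.append(x)

mean = sum(samples) / len(samples)
print(round(mean, 2))  # close to 0 for a well-mixed chain
```

Convergence diagnostics (trace plots, multiple chains, effective sample size) are the subject of the "convergence issues" topic.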
Course Objectives and Outcomes
CO1: To extend understanding of the practice of statistical inference.
CO2: To familiarize the student with the Bayesian approach to inference.
CO3: To describe the computational implementation of Bayesian analyses.
CO4: To use Bayesian computational software, e.g. R, for realistically complex problems and to interpret the results in context.