Bayesian inference is one of the two dominant approaches to statistical inference. The word "Bayesian" refers to the influence of Reverend Thomas Bayes, who introduced what is now known as Bayes' theorem. Bayesian inference was developed prior to what is incorrectly called classical statistics, which is more appropriately referred to as frequentist inference. Bayesian inference is a modern revival of the classical definition of probability, associated with Pierre-Simon Laplace, in contrast to the frequentist definition of probability, most often associated with R. A. Fisher.
Introduction to Bayesian Inference
The following topics are intended to be read in succession as an introduction to Bayesian inference, though links between sections allow readers to explore according to their interests. Enjoy.
- Bayes' Theorem
- Prior Probabilities
- Classes of Prior Distributions
- Hierarchical Bayes
- Numerical Approximation
- Bayes Factors
- Model Fit
- Posterior Predictive Checks
- Advantages of Bayesian Inference over Frequentist Inference
- Advantages of Frequentist Inference over Bayesian Inference
Additional Bayesian topics and information may be found via the site index. More information will be added in the future, so check back often.