Bayesian statistical methodology is based on Bayes' Theorem:

P(A|B) = \frac{P(B|A)\, P(A)}{P(B)}
  • P(A) is the prior probability of A. It is "prior" in the sense that it does not take into account any information about B.
  • P(A|B) is the conditional probability of A, given B. It is also called the posterior probability because it is derived from or depends upon the specified value of B.
  • P(B|A) is the conditional probability of B given A.
  • P(B) is the prior or marginal probability of B.
  • Generally, in Bayesian inference, A represents the model, while B represents the data.
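The theorem can be applied directly once the prior, the likelihood, and the marginal are known. As a minimal sketch, the following uses hypothetical numbers from a standard diagnostic-test example (all values are illustrative assumptions, not from the text above):

```python
# Hypothetical diagnostic-test example (numbers chosen for illustration).
# A = "patient has the condition", B = "test result is positive".
p_a = 0.01              # prior P(A): 1% base rate
p_b_given_a = 0.95      # likelihood P(B|A): test sensitivity
p_b_given_not_a = 0.05  # P(B|not A): false-positive rate

# Marginal P(B) via the law of total probability.
p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)

# Bayes' Theorem: posterior P(A|B).
p_a_given_b = p_b_given_a * p_a / p_b
print(round(p_a_given_b, 3))  # → 0.161
```

Note that even with a fairly accurate test, the posterior probability is only about 16%, because the prior P(A) is small; this is exactly the prior-dependence that Bayes' Theorem makes explicit.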

There are two common schools of thought on Bayesian inference, the objectivist view and the subjectivist view. Objectivists view probability statements as an extension of logic. Subjectivists view them as measures of "personal belief."

External Links

Bayes Recommendations From The Channel


< papshmear> Box/Tiao is okay for the mathematical background


< papshmear> An Introduction to Bayesian Inference and Decision by Winkler is quite nice

< papshmear> Gregory's is okay, but he's a maxent fanboy too

< sevenless> D. S. Sivia's book

< sevenless> Gelman 2003, Bayesian Data Analysis, is nice after Sivia

< sevenless> Gelman and Hill, 2007, on applied multilevel modeling and causal inference
