Bayes Theorem

Level 1 - Math II (Physics) topic page in Probability.

Principle

Bayes theorem reverses the direction of conditioning: it turns the probability of the evidence given a hypothesis, \(P(B|H)\), into the probability of the hypothesis given the evidence, \(P(H|B)\). In doing so it updates a prior belief about the hypothesis after an observation is made.

Notation

\(H\)
hypothesis event
\(B\)
observed evidence event
\(P(H)\)
prior probability of the hypothesis, before observing \(B\)
\(P(B|H)\)
likelihood: probability of the evidence if H is true
\(P(B)\)
evidence probability: total probability of observing B
\(P(H|B)\)
posterior probability of H after observing B
\(E_1,\ldots,E_n\)
partition events: exactly one of these cases occurs

Method

Derive Bayes theorem

Use the multiplication rule one way
\[P(H\cap B)=P(H|B)P(B)\]
Use the multiplication rule the other way
\[P(H\cap B)=P(B|H)P(H)\]
Equate the same intersection
\[P(H|B)P(B)=P(B|H)P(H)\]
Divide by the evidence probability
\[P(H|B)=\frac{P(B|H)P(H)}{P(B)}\]
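The final formula can be evaluated directly. A minimal numeric sketch, using illustrative values for the prior, likelihood, and evidence probability (these numbers are assumed, not from the text):

```python
# Bayes theorem: P(H|B) = P(B|H) * P(H) / P(B)
# Illustrative values, chosen only for this sketch.
p_h = 0.3           # prior P(H)
p_b_given_h = 0.8   # likelihood P(B|H)
p_b = 0.5           # evidence probability P(B)

# Posterior probability of H after observing B
p_h_given_b = p_b_given_h * p_h / p_b
print(round(p_h_given_b, 2))  # 0.48
```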

Use a partition for the denominator

If \(E_1,\ldots,E_n\) are mutually exclusive and exhaustive cases (a partition: exactly one of them occurs), then the evidence \(B\) must occur together with exactly one of them.

Split the evidence
\[B=(B\cap E_1)\cup\cdots\cup(B\cap E_n)\]
Add disjoint pieces
\[P(B)=P(B\cap E_1)+\cdots+P(B\cap E_n)\]
Use multiplication rule
\[P(B)=P(B|E_1)P(E_1)+\cdots+P(B|E_n)P(E_n)\]
Summation form
\[P(B)=\sum_{i=1}^n P(B|E_i)P(E_i)\]
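The summation form translates directly into a weighted sum over the cases. A short sketch with assumed priors and likelihoods for a three-case partition:

```python
# Total probability of the evidence over a partition E_1..E_n:
#   P(B) = sum_i P(B|E_i) * P(E_i)
# Illustrative values, chosen only for this sketch.
priors = [0.5, 0.3, 0.2]       # P(E_i); a partition's priors sum to 1
likelihoods = [0.9, 0.4, 0.1]  # P(B|E_i)

p_b = sum(l * p for l, p in zip(likelihoods, priors))
print(round(p_b, 2))  # 0.59
```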

Rules

Bayes theorem
\[P(H|B)=\frac{P(B|H)P(H)}{P(B)}\]
Evidence from a partition
\[P(B)=\sum_{i=1}^n P(B|E_i)P(E_i)\]
Bayes with partition denominator
\[P(E_j|B)=\frac{P(B|E_j)P(E_j)}{\sum_{i=1}^n P(B|E_i)P(E_i)}\]
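The partition form of Bayes theorem can be sketched as one function. The function name and the numeric values below are assumptions made for illustration:

```python
def posterior(priors, likelihoods, j):
    """P(E_j|B) = P(B|E_j)P(E_j) / sum_i P(B|E_i)P(E_i)."""
    evidence = sum(l * p for l, p in zip(likelihoods, priors))
    return likelihoods[j] * priors[j] / evidence

# Illustrative values, chosen only for this sketch.
priors = [0.5, 0.3, 0.2]       # P(E_i)
likelihoods = [0.9, 0.4, 0.1]  # P(B|E_i)

posteriors = [posterior(priors, likelihoods, j) for j in range(3)]
# Because the denominator sums over the whole partition,
# the posteriors across all cases add up to 1.
print(round(sum(posteriors), 10))  # 1.0
```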

Examples

Question
A disease affects 1 percent of people. A test is positive with probability 0.99 if diseased and 0.05 if not diseased. Find the probability of disease given a positive test.
Answer
Let \(D\) be disease and \(T\) be positive. The denominator includes true positives and false positives:
\[P(D|T)=\frac{0.99\cdot0.01}{0.99\cdot0.01+0.05\cdot0.99}=\frac{0.0099}{0.0594}=\frac{1}{6}.\]
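The arithmetic in the answer can be checked exactly with rational numbers, which confirms the clean value \(\tfrac{1}{6}\):

```python
from fractions import Fraction

p_d = Fraction(1, 100)           # prior P(D): 1 percent
p_t_given_d = Fraction(99, 100)  # P(T|D): true-positive rate
p_t_given_nd = Fraction(5, 100)  # P(T|not D): false-positive rate

# Denominator: true positives plus false positives
p_t = p_t_given_d * p_d + p_t_given_nd * (1 - p_d)
p_d_given_t = p_t_given_d * p_d / p_t
print(p_d_given_t)  # 1/6
```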

Checks

  • The denominator must include all ways the observation can happen.
  • A high likelihood does not guarantee a high posterior when the prior is small.
  • Posterior probabilities across all partition hypotheses should add to \(1\).
  • Bayes theorem reverses conditioning; it does not say \(P(H|B)=P(B|H)\).