Simple Bayesian Analysis — Technical Guide
The Core Idea
Bayesian analysis updates a prior belief using new evidence to produce a posterior belief. The engine is Bayes’ theorem:
P(H | D) = P(D | H) · P(H) / P(D)
where:
- H: hypothesis or parameter
- D: observed data
- P(H): prior probability
- P(D | H): likelihood
- P(D): evidence (normalizer)
- P(H | D): posterior probability
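A minimal sketch of this update in Python, assuming a finite set of mutually exclusive hypotheses (the function name and dictionaries are illustrative): multiply each prior by its likelihood, then divide by the evidence.

```python
def bayes_update(priors, likelihoods):
    """Posterior over a finite set of mutually exclusive hypotheses.

    priors      -- dict mapping hypothesis -> P(H)
    likelihoods -- dict mapping hypothesis -> P(D | H)
    """
    unnormalized = {h: priors[h] * likelihoods[h] for h in priors}
    evidence = sum(unnormalized.values())  # P(D), the normalizer
    return {h: p / evidence for h, p in unnormalized.items()}

# Example: the diagnostic-test numbers from the next section
print(bayes_update({"disease": 0.01, "healthy": 0.99},
                   {"disease": 0.99, "healthy": 0.05}))
# {'disease': 0.1666..., 'healthy': 0.8333...}
```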
One-Shot Example: Diagnostic Test
Setup. Disease prevalence 1%; sensitivity 99%; specificity 95% (false-positive rate 5%).
- P(H) = 0.01
- P(Positive | H) = 0.99
- P(Positive | ¬H) = 0.05
Evidence (law of total probability):
P(Positive) = P(Positive | H) · P(H) + P(Positive | ¬H) · P(¬H) = 0.99 × 0.01 + 0.05 × 0.99 = 0.0099 + 0.0495 = 0.0594
Posterior (positive predictive value):
P(H | Positive) = P(Positive | H) · P(H) / P(Positive) = 0.0099 / 0.0594 ≈ 0.167
Even with a high-accuracy test, the posterior is only ≈16.7% because the disease is rare (the prior dominates).
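The same arithmetic as a short Python check, using only the numbers stated in the setup above:

```python
# Diagnostic-test posterior (positive predictive value), step by step
prevalence  = 0.01   # P(H)
sensitivity = 0.99   # P(Positive | H)
fp_rate     = 0.05   # P(Positive | not H) = 1 - specificity

# Law of total probability: P(Positive)
evidence = sensitivity * prevalence + fp_rate * (1 - prevalence)

# Bayes' theorem: P(H | Positive)
ppv = sensitivity * prevalence / evidence
print(f"P(Positive) = {evidence:.4f}")   # 0.0594
print(f"PPV         = {ppv:.3f}")        # 0.167
```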
Conjugate Priors (Closed-Form Updates)
Beta–Binomial (probabilities)
For a Bernoulli/Binomial likelihood with success probability θ and prior θ ~ Beta(α, β), after observing s successes and f failures the posterior is
θ | data ~ Beta(α + s, β + f),
with posterior mean (α + s) / (α + β + s + f).
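A minimal sketch of the closed-form update (the helper name and the numbers below are illustrative, not from the worked example later on):

```python
def beta_binomial_update(alpha, beta, successes, failures):
    """Conjugate update: Beta(alpha, beta) prior + Binomial data -> Beta posterior."""
    return alpha + successes, beta + failures

# Illustrative numbers: uniform prior Beta(1, 1), then 7 successes and 3 failures
a_post, b_post = beta_binomial_update(1, 1, 7, 3)
print(a_post, b_post)               # 8 4
print(a_post / (a_post + b_post))   # posterior mean 8/12 ≈ 0.667
```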
Gamma–Poisson (rates)
For counts Y ~ Poisson(λ) with prior λ ~ Gamma(k, θ) (shape–rate), given total count Σy over n exposure units the posterior is
λ | data ~ Gamma(k + Σy, θ + n),
with posterior mean (k + Σy) / (θ + n).
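The analogous sketch for Poisson counts in the shape–rate parametrization used above (helper name and numbers are illustrative):

```python
def gamma_poisson_update(shape, rate, total_count, n_units):
    """Conjugate update: Gamma(shape, rate) prior + Poisson counts -> Gamma posterior."""
    return shape + total_count, rate + n_units

# Illustrative numbers: prior Gamma(2, 1), then 30 events over 10 exposure units
shape_post, rate_post = gamma_poisson_update(2, 1, 30, 10)
print(shape_post, rate_post)    # 32 11
print(shape_post / rate_post)   # posterior mean rate ≈ 2.91 events per unit
```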
Worked Beta–Binomial Update (Landing-Page Conversion)
Prior. Mean ≈ 5% with moderate spread → choose Beta(2, 38) (mean 2/40).
Data. s = 18 conversions in n = 300 visitors (f = 282).
Posterior. Beta(2 + 18, 38 + 282) = Beta(20, 320), with posterior mean 20/340 ≈ 5.9%.
The posterior mean shifts from 5.0% to ≈5.9%, reflecting modest evidence for a better rate while honoring prior knowledge.
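A short sketch that reproduces this update and adds a 95% credible interval, assuming SciPy is available:

```python
from scipy import stats

alpha_prior, beta_prior = 2, 38     # prior mean 2/40 = 5%
s, f = 18, 282                      # 18 conversions out of 300 visitors

alpha_post, beta_post = alpha_prior + s, beta_prior + f   # Beta(20, 320)
posterior = stats.beta(alpha_post, beta_post)

print(f"posterior mean        : {posterior.mean():.4f}")   # ≈ 0.0588
lo, hi = posterior.ppf([0.025, 0.975])                      # equal-tailed 95% interval
print(f"95% credible interval : ({lo:.4f}, {hi:.4f})")
```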
When to Use Simple Bayesian Analysis
- Low-data regimes (startups, rare events)
- Sequential decisions with continual updates
- Risk-sensitive contexts that need uncertainty quantification
- A/B tests with principled early stopping via posterior probabilities
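For the A/B-test case, a minimal Monte Carlo sketch of the early-stopping quantity P(θ_B > θ_A), assuming independent Beta posteriors for the two arms (all counts are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Beta posteriors for each arm after a Beta(1, 1) prior (conversion counts are illustrative)
post_a = rng.beta(1 + 40, 1 + 960, size=100_000)   # A: 40 conversions / 1000 visitors
post_b = rng.beta(1 + 55, 1 + 945, size=100_000)   # B: 55 conversions / 1000 visitors

prob_b_beats_a = (post_b > post_a).mean()
print(f"P(theta_B > theta_A) ≈ {prob_b_beats_a:.3f}")
# Stop early once this probability clears a pre-chosen threshold (e.g. 0.95).
```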
Common Pitfalls (and Mitigations)
- Priors that are too strong or poorly informed → run prior sensitivity analyses (see the sketch after this list).
- Mis-specified likelihood → validate assumptions; consider robust models.
- Over-reliance on MAP → report full posterior summaries and credible intervals.
- No model checking → use posterior predictive checks.
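As a concrete example of a prior sensitivity analysis for the Beta–Binomial case (the set of priors is illustrative): refit the same data under several plausible priors and compare the posterior summaries.

```python
# Prior sensitivity for the Beta-Binomial model (priors and data are illustrative)
priors = {
    "flat":        (1, 1),
    "weak 5%":     (1, 19),
    "moderate 5%": (2, 38),
    "strong 5%":   (10, 190),
}
s, f = 18, 282   # the same data, refit under every prior

for name, (a, b) in priors.items():
    a_post, b_post = a + s, b + f
    print(f"{name:>12}: posterior mean = {a_post / (a_post + b_post):.4f}")
# If conclusions change materially across rows, the prior is doing too much work.
```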
Implementation Notes
- Use conjugate families (Beta–Binomial, Gamma–Poisson) for fast closed-form updates.
- For non-conjugate or hierarchical models, sample the posterior (e.g., PyMC/Stan) and compute credible intervals.
- Report posterior mean/median and a 95% credible interval; include posterior predictive diagnostics.
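A minimal posterior predictive check for the conjugate worked example, using NumPy only (a sketch, not a full diagnostic suite): draw rates from the posterior, simulate replicated counts, and see whether the observed count looks typical.

```python
import numpy as np

rng = np.random.default_rng(1)

alpha_post, beta_post, n = 20, 320, 300   # posterior and sample size from the worked example
observed = 18                             # observed conversions

theta_draws = rng.beta(alpha_post, beta_post, size=10_000)   # posterior draws of the rate
replicated = rng.binomial(n, theta_draws)                    # posterior predictive counts

# Simple tail-area check: how often do replicated datasets exceed the observed count?
p_tail = (replicated >= observed).mean()
print(f"posterior predictive P(rep >= observed) = {p_tail:.2f}")
# Values very close to 0 or 1 would flag model misfit.
```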
Summary Table
| Concept | Technical Summary |
|---|---|
| Bayes’ Theorem | P(H ∣ D) = P(D ∣ H) · P(H) / P(D) |
| Evidence | P(D) = P(D ∣ H) · P(H) + P(D ∣ ¬H) · P(¬H) |
| Beta–Binomial Posterior | Beta(α + s, β + f) after s successes and f failures |
| Gamma–Poisson Posterior | Gamma(k + Σy, θ + n) after total count Σy over n exposure units (shape–rate) |
| Posterior Mean (Beta) | (α + s) / (α + β + s + f) |
| MAP (Beta) | (α + s − 1) / (α + β + s + f − 2), valid when both posterior parameters exceed 1 |
| Diagnostic PPV | P(H ∣ Positive) = 0.0099 / 0.0594 ≈ 0.167 |