“the explicit use of external evidence in the design, monitoring, analysis, interpretation and reporting of a (scientific investigation)” (Spiegelhalter, 2004)
A Bayesian is one who, vaguely expecting to see a horse and catching a glimpse of a donkey, strongly concludes he has seen a mule. (Senn, 1997)
“You shall know them by their posteriors”
derivation \[ Pr[A \cap B] = Pr[B \cap A] \\ Pr[A \cap B] = Pr[A|B] Pr[B]\\ Pr[B \cap A] = Pr[B|A] Pr[A]\\ Pr[A|B] Pr[B] = Pr[B|A] Pr[A]\\ Pr[A|B] = \frac{Pr[B|A] Pr[A]}{Pr[B]}\\ Pr[B|A] = \frac{Pr[A|B] Pr[B]}{Pr[A]} \]
by law of total probability, \(Pr[A] = Pr[A|B] Pr[B] + Pr[A|\overline{B}] Pr[\overline{B}]\)
so the denominator becomes \[ Pr[B|A] = \frac{Pr[A|B] Pr[B]}{Pr[A|B] Pr[B] + Pr[A|\overline{B}] Pr[\overline{B}]} \] or, for a partition \(\{B_i\}\), \[ Pr[B_j|A] = \frac{Pr[A|B_j] Pr[B_j]}{\sum_i Pr[A|B_i] Pr[B_i]} \]
for continuous probability distributions, \[ Pr[B|A] = \frac{Pr[A|B] Pr[B]}{\int Pr[A|B] Pr[B] \, dB} \]
or, in terms of parameters and data, \[ Pr[\theta|y] = \frac{Pr[y|\theta] Pr[\theta]}{\int Pr[y|\theta] Pr[\theta] \, d\theta} \]
| | Western Blot + | Western Blot - |
|---|---|---|
| ELISA + | 498 | 4 |
| ELISA - | 10 | 488 |
In R: \(((498/502)*(502/1000))/(((498/502)*(502/1000)) + ((10/498)*(498/1000)))\) = 98%
positive predictive value \[ Pr[D|+] = \frac{Pr[+|D]Pr[D]}{Pr[+|D]Pr[D] + Pr[+|\overline{D}]Pr[\overline{D}]} = \frac{(498/508)(508/1000)}{(498/508)(508/1000) + (4/492)(492/1000)} \approx 99\% \]
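as a quick numeric check, the two calculations above can be reproduced in a few lines (Python here, standing in for the notes' R):

```python
# Pr[ELISA+ | WB+] from the 2x2 table -- the "98%" calculation
p1 = (498/502)*(502/1000) / ((498/502)*(502/1000) + (10/498)*(498/1000))

# positive predictive value Pr[D | +] with D = Western Blot +,
# using column totals 508 (WB+) and 492 (WB-) -- the "99%" calculation
p2 = (498/508)*(508/1000) / ((498/508)*(508/1000) + (4/492)*(492/1000))

print(round(p1, 3), round(p2, 3))
```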
| | Western Blot + | Western Blot - |
|---|---|---|
| ELISA + | 1960 | 9944 |
| ELISA - | 40 | 990056 |
positive predictive value \[ \frac{(1960/2000)(2000/1000000)}{(1960/2000)(2000/1000000) + (9944/998000)(998000/1000000)} \approx 16\% \]
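computing the positive predictive value directly from the table counts shows how a low prevalence (2000 in 1,000,000) collapses the PPV even though the test itself is unchanged:

```python
# PPV from the 2x2 screening table (n = 1,000,000, prevalence 2000)
true_pos = 1960    # ELISA+, Western Blot+
false_pos = 9944   # ELISA+, Western Blot-

ppv = true_pos / (true_pos + false_pos)
print(round(ppv, 3))
```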
a frequentist is someone who only uses Bayes' Theorem sometimes
the Rev. Bayes says \[ posterior \; \propto \; prior \; \times \; likelihood \\ p(\theta | y) \propto p(\theta) \, p(y | \theta) \]
\[ \sim N(\mu, \sigma^2) \]
\[ \sim Unif(a,b), \\ \mu = \frac{(a+b)}{2}, \\ \sigma^2 = \frac{(b-a)^2}{12} \]
\[ \sim Exp(\lambda),\\ \mu = 1/\lambda,\\ \sigma^2 = 1/\lambda^2 \]
\[ \sim \Gamma(\alpha, \beta),\\ \mu=\frac{\alpha}{\beta} \\ \sigma^2=\frac{\alpha}{\beta^2} \]
\[ \sim Beta(\alpha,\beta)\\ \mu=\frac{\alpha}{\alpha+\beta}\\ \sigma^2=\frac{\alpha \beta}{(\alpha + \beta)^2 (\alpha + \beta + 1)} \]
\[ \sim Dirichlet(\alpha_1, ... \alpha_k) \]
Likelihood | Prior | Posterior |
---|---|---|
Normal | Normal | Normal |
Binomial | Beta | Beta |
Poisson | Gamma | Gamma |
conjugate solution with our uniform Beta(1, 1) prior and k successes in n trials \[ \sim Beta(1+k, 1+n-k) \]
conjugacy of binomial and beta: a Beta(a, b) prior with k successes in n trials gives the posterior \[ Beta(a+k, b+n-k) \]
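a small numeric check of this conjugate update, with hypothetical numbers (Beta(2, 3) prior, k = 7 successes in n = 10 trials; none from the notes), comparing a brute-force grid posterior against the closed form — sketched in Python rather than the notes' R:

```python
a, b, k, n = 2, 3, 7, 10  # hypothetical prior and data

# grid approximation: posterior proportional to prior * likelihood
thetas = [i / 1000 for i in range(1, 1000)]
unnorm = [t**(a-1) * (1-t)**(b-1) * t**k * (1-t)**(n-k) for t in thetas]
z = sum(unnorm)
grid_mean = sum(t * w for t, w in zip(thetas, unnorm)) / z

# closed-form conjugate posterior Beta(a+k, b+n-k) has mean (a+k)/(a+b+n)
conj_mean = (a + k) / (a + b + n)

print(grid_mean, conj_mean)
```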
in R, sample the beta distribution:

```r
x <- rbeta(1000, 12, 28)
plot(density(x))
summary(x)
```
plan a trial of 20 patients: what is the probability that 15 will respond?
Monte Carlo simulation in BUGS or JAGS
prior still Beta (9.2, 13.8)
binomial probability of 15 successes in 20 trials
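this predictive calculation can be sketched as a Monte Carlo simulation in Python (standing in for the notes' BUGS/JAGS): draw a response rate from the Beta(9.2, 13.8) prior, simulate 20 patients, and count how often exactly 15 respond.

```python
import random

random.seed(1)

N = 100_000
hits = 0
for _ in range(N):
    theta = random.betavariate(9.2, 13.8)                # draw rate from the prior
    y = sum(random.random() < theta for _ in range(20))  # simulate 20 patients
    if y == 15:
        hits += 1

p15 = hits / N
print(p15)  # predictive Pr[y = 15]; small, since the prior mean is ~0.4
```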
algorithms to evaluate posterior given (almost) any prior and likelihood
write the model in code form
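for the 20-patient example above, a minimal BUGS/JAGS model might look like this (a sketch, not the notes' original code; it assumes the Beta(9.2, 13.8) prior and an observed response count y out of 20):

```
model {
  theta ~ dbeta(9.2, 13.8)   # prior on the response rate
  y ~ dbin(theta, 20)        # binomial likelihood, 20 patients
}
```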
Poisson likelihood \[ P(k) = \frac{e^{-\lambda} \lambda^k}{k!} \]
\(\mu = \sigma^2 = \lambda\)
| Hits (x) | 0 | 1 | 2 | 3 | 4 | 7 |
|---|---|---|---|---|---|---|
| Areas (n) | 229 | 211 | 93 | 35 | 7 | 1 |
mean \(\bar{x}=537/576=0.93\)
posterior \[ p(\theta | y) = \Gamma(a+n\bar{x}, b+n) = \Gamma(537.5, 576) \\ \mu = 537.5/ 576 = 0.933 \\ \sigma^2 = 537.5/ 576^2 = 0.0016 \]
as sample size increases, posterior mean approaches MLE mean, posterior s.d. approaches MLE s.d.
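a quick numeric check of the posterior summaries above (the Γ(537.5, 576) posterior implies prior parameters a = 0.5, b = 0, an assumption read off from the update rule Γ(a + nx̄, b + n)):

```python
n, total_hits = 576, 537     # 576 areas, 537 hits in all
a, b = 0.5, 0.0              # implied (improper) Gamma prior -- an assumption

post_shape = a + total_hits  # 537.5
post_rate = b + n            # 576

post_mean = post_shape / post_rate      # posterior mean
post_var = post_shape / post_rate**2    # posterior variance
mle_mean = total_hits / n               # MLE mean, already close to the posterior mean

print(post_mean, post_var, mle_mean)
```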
model as a Poisson process of counts
the Bayesian approach can be informative.
posterior, combining the control counts (subscript c) with the trial counts (subscript t) \[ \Gamma(\alpha + y_t, \beta + n_t ) = \Gamma(\Sigma y_c + y_t, \Sigma n_c + n_t ) \]
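to make the update concrete, a sketch with hypothetical counts (Σy_c = 120 events in Σn_c = 100 control units, y_t = 8 events in n_t = 10 trial units; none of these numbers come from the notes), checking the analytic posterior mean against Monte Carlo draws:

```python
import random

random.seed(2)

sum_yc, sum_nc = 120, 100   # hypothetical pooled control counts
y_t, n_t = 8, 10            # hypothetical trial counts

shape = sum_yc + y_t        # Gamma posterior shape, 128
rate = sum_nc + n_t         # Gamma posterior rate, 110

post_mean = shape / rate    # analytic posterior mean

# Monte Carlo draws from the Gamma posterior
# (random.gammavariate takes shape and *scale* = 1/rate)
draws = [random.gammavariate(shape, 1 / rate) for _ in range(50_000)]
mc_mean = sum(draws) / len(draws)

print(post_mean, mc_mean)
```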