lectures/bayes_nonconj.md (+3 −3)
@@ -13,7 +13,7 @@ kernelspec:
# Non-Conjugate Priors

- This lecture is a sequel to the QuantEcon lecture <https://python.quantecon.org/prob_meaning.html>
+ This lecture is a sequel to the {doc}`quantecon lecture <prob_meaning>`.

That lecture offers a Bayesian interpretation of probability in a setting in which the likelihood function and the prior distribution over parameters just happened to form a **conjugate** pair in which
@@ -88,7 +88,7 @@ from numpyro.optim import Adam as nAdam
## Unleashing MCMC on a Binomial Likelihood

- This lecture begins with the binomial example in the QuantEcon lecture <https://python.quantecon.org/prob_meaning.html>
+ This lecture begins with the binomial example in the {doc}`quantecon lecture <prob_meaning>`.

That lecture computed a posterior
@@ -103,7 +103,7 @@ We use both the packages `pyro` and `numpyro` with assistance from `jax` to app
We use several alternative prior distributions

- We compare computed posteriors with ones associated with a conjugate prior as described in QuantEcon lecture https://python.quantecon.org/prob_meaning.html
+ We compare computed posteriors with ones associated with a conjugate prior as described in {doc}`the quantecon lecture <prob_meaning>`.
lectures/prob_meaning.md (+31 −0)
@@ -693,3 +693,34 @@ Thus, the Bayesian statistician comes to believe that $\theta$ is near $.4$.
As shown in the figure above, as the number of observations grows, the Bayesian coverage intervals (BCIs) become narrower and narrower around $0.4$.

However, if you take a closer look, you will find that the centers of the BCIs are not exactly $0.4$, due to the persistent influence of the prior distribution and the randomness of the simulation path.
+
+ ## Role of a Conjugate Prior
+
+ We have made assumptions that link functional forms of our likelihood function and our prior in a way that has eased our calculations considerably.
+
+ In particular, our assumptions that the likelihood function is **binomial** and that the prior distribution is a **beta distribution** have the consequence that the posterior distribution implied by Bayes' Law is also a **beta distribution**.
+
+ So posterior and prior are both beta distributions, albeit ones with different parameters.
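+
+ Concretely, the standard beta-binomial algebra (spelled out here for reference) runs as follows: with a $\textrm{Beta}(\alpha, \beta)$ prior and $k$ successes observed in $N$ binomial trials, Bayes' Law gives
+
+ $$
+ \pi(\theta \,|\, k) \propto \theta^{\alpha-1}(1-\theta)^{\beta-1} \cdot \theta^{k}(1-\theta)^{N-k} = \theta^{\alpha+k-1}(1-\theta)^{\beta+N-k-1},
+ $$
+
+ so the posterior is again a beta distribution, namely $\textrm{Beta}(\alpha+k, \beta+N-k)$.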
+
+ When a likelihood function and prior fit together like hand and glove in this way, we can say that the prior and posterior are **conjugate distributions**.
+
+ In this situation, we also sometimes say that we have a **conjugate prior** for the likelihood function $\textrm{Prob}(X | \theta)$.
+
+ Typically, the functional form of the likelihood function determines the functional form of a **conjugate prior**.
+
+ A natural question to ask is: why should a person's personal prior about a parameter $\theta$ be restricted to one described by a conjugate prior?
+
+ Why not some other functional form that more sincerely describes the person's beliefs?
715
+
716
+
To be argumentative, one could ask, why should the form of the likelihood function have *anything* to say about my
717
+
personal beliefs about $\theta$?
718
+
719
+
A dignified response to that question is, well, it shouldn't, but if you want to compute a posterior easily you'll just be happier if your prior is conjugate to your likelihood.
720
+
721
+
Otherwise, your posterior won't have a convenient analytical form and you'll be in the situation of wanting to
722
+
apply the Markov chain Monte Carlo techniques deployed in {doc}`this quantecon lecture <bayes_nonconj>`.
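+
+ For a flavor of what that involves, here is a minimal sketch (not the lecture's code; the truncated-normal prior and the data $N=10$, $k=4$ are assumptions chosen purely for illustration) of sampling such a non-conjugate posterior with `numpyro`'s NUTS sampler:
+
+ ```python
+ from jax import random
+ import numpyro
+ import numpyro.distributions as dist
+ from numpyro.infer import MCMC, NUTS
+
+ def model(N, k):
+     # hypothetical non-conjugate prior: a normal truncated to [0, 1]
+     theta = numpyro.sample(
+         "theta", dist.TruncatedNormal(loc=0.5, scale=0.2, low=0.0, high=1.0)
+     )
+     # binomial likelihood: k successes observed in N trials
+     numpyro.sample("obs", dist.Binomial(total_count=N, probs=theta), obs=k)
+
+ mcmc = MCMC(NUTS(model), num_warmup=1000, num_samples=2000, progress_bar=False)
+ mcmc.run(random.PRNGKey(0), N=10, k=4)  # illustrative data: 4 successes in 10 trials
+ mcmc.print_summary()
+ ```
+
+ Because no conjugacy is available here, the posterior has no convenient closed form; the sampler approximates it by simulation.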
723
+
724
+
We also apply these powerful methods to approximating Bayesian posteriors for non-conjugate priors in
725
+
{doc}`this quantecon lecture <ar1_bayes>` and {doc}`this quantecon lecture <ar1_turningpts>`