Exercise 2.1
  [TA] Di Su    2022A2Ex1
Exercise 2.1 (${\color{red}\star}{\color{gray}\star}{\color{gray}\star}$ - Different types of priors (50%)). Suppose that a new virus that causes a pandemic was recently discovered, and a new vaccine against it was just developed. The aim of this exercise is to estimate the vaccine's efficacy \(\theta\), which is defined as \[\begin{aligned} \theta &= \frac{\pi_0 - \pi_1}{\pi_0}, \end{aligned}\] where \(\pi_0\) and \(\pi_1\) are the attack rates of unvaccinated and vaccinated humans, respectively. In a preclinical stage, the vaccine was tested on laboratory animals (not humans). The following data were obtained.
|                    | Unvaccinated animals | Vaccinated animals |
|--------------------|----------------------|--------------------|
| Infected animals   | 14                   | 3                  |
| Uninfected animals | 1                    | 17                 |
Now, the vaccine is tested on healthy humans. Suppose there are \(n+m\) people. Among them, \(m\) people are randomly assigned to the control group (\(j=0\)), and the rest to the treatment group \((j=1)\). Denote \[x_{i}^{(j)} = \mathbb{1}(\text{the $i$th person in group $j$ is infected})\] for each \(i\) and \(j\). Let \[[ x_1^{(0)}, \ldots, x_m^{(0)} \mid \pi_0, \pi_1 ] \overset{ \text{iid}}{\sim} \text{Bern}(\pi_0) \qquad \text{and} \qquad [ x_1^{(1)}, \ldots, x_n^{(1)} \mid \pi_0, \pi_1 ] \overset{ \text{iid}}{\sim} \text{Bern}(\pi_1)\] be two independent samples given \(\pi_0, \pi_1\).
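As a side note, the sampling model above can be sketched in a few lines of Python. The group sizes and attack rates below are hypothetical placeholders, not values from the exercise.

```python
import random

def simulate_trial(m, n, pi0, pi1, seed=1):
    """Simulate the two independent Bernoulli samples of the model:
    x_i^(0) ~ Bern(pi0) for the m control-group people, and
    x_i^(1) ~ Bern(pi1) for the n treatment-group people."""
    rng = random.Random(seed)
    x0 = [int(rng.random() < pi0) for _ in range(m)]  # control group (j = 0)
    x1 = [int(rng.random() < pi1) for _ in range(n)]  # treatment group (j = 1)
    return x0, x1

# Hypothetical illustration: 100 controls and 100 vaccinated people.
x0, x1 = simulate_trial(m=100, n=100, pi0=0.30, pi1=0.05)
```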
(40%) Suggest, with a brief explanation (\(\lesssim\) 20 words each) or mathematical derivation,
a conjugate prior on \((\pi_0, \pi_1)\),
an informative prior on \((\pi_0, \pi_1)\),
a non-informative prior on \((\pi_0, \pi_1)\), and
a weakly-informative prior on \((\pi_0, \pi_1)\).
(10%) Emma believes that vaccinating must be better than doing nothing. So, she uses the following prior \(f(\pi_0, \pi_1) \propto \mathbb{1}(\pi_0\geq \pi_1).\) Comment. (Use \(\lesssim 50\) words.)
Hints:
You may define a prior for \((\pi_0,\pi_1)\) so that \(\pi_0\) and \(\pi_1\) are independent.
Using the independence of \(\pi_0\) and \(\pi_1\), you can derive the conjugate prior individually.
This part illustrates the power of Bayesian analysis. Although data from humans and animals are likely generated by different mechanisms, it is still natural to approximate one mechanism by the other. So, you may construct a prior based on the animal dataset to reflect your belief. Some possible methods are suggested below:
For \(j=0,1\), construct a conjugate prior for \(\pi_j\) with hyper-parameters selected according to their interpretation in Example 2.13 of the lecture note.
For \(j=0,1\), define \(\pi_j \sim \text{Beta}(\alpha^{(j)},\beta^{(j)})\), where \(\alpha^{(j)}\) and \(\beta^{(j)}\) are selected so that \[\begin{aligned} \mathsf{E}(\pi_j) = \widehat{\pi}_j + (\text{Bias} ) \qquad \text{and} \qquad {\mathsf{Var}}(\pi_j) = \widehat{\sigma}^2_j \times (\text{Inflation factor}), \end{aligned}\] where \(\widehat{\pi}_j\) is a frequentist estimator of \(\pi_j\), and \(\widehat{\sigma}^2_j\) is a frequentist estimator of \({\mathsf{Var}}(\widehat{\pi}_j)\) as a proxy of \({\mathsf{Var}}(\pi_j)\). If you firmly believe your location estimate, then you may set the bias to zero and the inflation factor to one. You may use the fact that if \(\xi \sim \text{Beta}(\alpha, \beta)\) satisfies \(\mathsf{E}(\xi)=\mu\) and \({\mathsf{Var}}(\xi)=\sigma^2\), then \[\alpha = \frac{\mu^2(1-\mu)}{\sigma^2}-\mu \qquad \text{and} \qquad \beta=\alpha\left(\frac{1}{\mu}-1 \right).\]
You may use the Jeffreys prior on each parameter.
Is there any hardly deniable information on \(\pi_0\) and \(\pi_1\)?
(Open-ended) Is Emma's belief (on the support of \((\pi_0,\pi_1)\)) too strong? Will it affect the posterior analysis? All reasonable answers are acceptable.
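The moment-matching recipe in the hints can be sketched numerically. The sketch below just implements the two formulas from the hint and checks them by recomputing the Beta mean and variance; the moments \(\mu\) and \(\sigma^2\) used in the example are placeholders, not the exercise's answer.

```python
def beta_from_moments(mu, var):
    """Solve for Beta(alpha, beta) hyper-parameters so that
    E(xi) = mu and Var(xi) = var, using the formulas in the hint:
    alpha = mu^2 (1 - mu) / var - mu  and  beta = alpha (1/mu - 1)."""
    alpha = mu**2 * (1 - mu) / var - mu
    beta = alpha * (1 / mu - 1)
    return alpha, beta

def beta_moments(alpha, beta):
    """Mean and variance of Beta(alpha, beta), used to verify the fit."""
    mean = alpha / (alpha + beta)
    var = alpha * beta / ((alpha + beta) ** 2 * (alpha + beta + 1))
    return mean, var

# Placeholder moments for illustration only:
a, b = beta_from_moments(mu=0.5, var=0.05)  # -> Beta(2, 2)
```

Round-tripping through `beta_moments` confirms that the fitted Beta distribution reproduces the target mean and variance.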
Confusion about the first question
 Anonymous Orangutan    Last Modified: 2022-02-08 23:59
I am confused about the first question.
You may define a prior for $(\pi_0, \pi_1)$ so that $\pi_0$ and $\pi_1$ are independent.
  • Using the independence of $\pi_0$ and $\pi_1$, you can derive the conjugate prior individually.
At this moment, I don't know how to derive the answer, e.g.,
  1. $f(\pi_0, \pi_1) \sim \text{Beta}(\cdot)$, or
  2. $f(\pi_0) \sim \text{Gamma}(\alpha_0)/\beta_0$ and $f(\pi_1) \sim \text{Gamma}(\alpha_1)/\beta_1$.
I think the correct answer is 1, but if so, what is the meaning of "independent" and "individually"?
  [Instructor] Kin Wai (Keith) Chan    Created at: 2022-02-08 23:00
It is a good question. “Independence” means probabilistic independence. “Individually” means that we define a prior for each parameter, one by one.
Indeed, both methods mentioned by you are correct:
  • (Method 1: Joint PDF) Let $(\pi_0, \pi_1)$ be RVs with
    \[
    f(\pi_0, \pi_1) = f(\pi_0) f(\pi_1) .
    \]
    Note that the joint PDF is the product of two marginal PDFs by independence.
  • (Method 2: Words) Let
    \[
    \pi_0 \sim (\text{Some distribution})\qquad \text{and} \qquad \pi_1 \sim (\text{Some distribution})
    \]
    be independent.
One example is shown below. Suppose $\theta_1$ and $\theta_2$ are the mean parameters of two normal samples. Then we may define a prior for $(\theta_1,\theta_2)$ as follows.
  • (Method 1: Joint PDF) Let $f(\theta_1, \theta_2) = \texttt{dnorm}(\theta_1, 0, 1) \times \texttt{dnorm}(\theta_2, 2, 3)$, where $\texttt{dnorm}(\cdot, \mu, \sigma)$ is the PDF of $\text{Normal}(\mu, \sigma^2)$.
  • (Method 2: Words) Let $\theta_1 \sim \text{Normal}(0,1)$ and $\theta_2 \sim \text{Normal}(2,3^2)$ be independent.
Note that both methods above define the same prior.
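The equivalence of the two methods can also be checked numerically. The sketch below re-implements R's `dnorm` in plain Python and evaluates the Method 1 joint PDF at an arbitrary point; by construction it equals the product of the two marginal densities, which is exactly what Method 2 (independent components) asserts.

```python
import math

def dnorm(x, mu, sigma):
    """PDF of Normal(mu, sigma^2), mirroring R's dnorm(x, mu, sigma)."""
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2 * math.pi))

def joint_prior(theta1, theta2):
    """Method 1: joint prior PDF written as a product of the two marginals,
    f(theta1, theta2) = dnorm(theta1, 0, 1) * dnorm(theta2, 2, 3)."""
    return dnorm(theta1, 0, 1) * dnorm(theta2, 2, 3)

# Evaluate the joint density at an arbitrary point.
p = joint_prior(0.5, 1.0)
```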
  [TA] Di Su    Created at: 2022-02-08 23:53
Thanks for your question, and thanks for Keith's detailed reply! Here are some of my thoughts on your question.
Firstly, the independence of two random variables $\pi_0$ and $\pi_1$ means that
$$f(\pi_0,\pi_1)=f(\pi_0)f(\pi_1),$$
that is, the joint density of $\pi_0$ and $\pi_1$ can be factorized as the product of the marginal density of $\pi_0$ and the marginal density of $\pi_1$.
Secondly, “derive the conjugate prior individually” means to derive the prior of $\pi_0$ and the prior of $\pi_1$ separately, so that $\pi_0$ and $[\pi_0\mid x_{1:m}^{(0)},x_{1:n}^{(1)}]$ are conjugate; moreover, $\pi_1$ and $[\pi_1\mid x_{1:m}^{(0)},x_{1:n}^{(1)}]$ are also conjugate.
Thirdly, the Beta distribution that we have learned in the lecture is univariate. Can we assign it as a prior for $(\pi_0,\pi_1)$ which consists of two random variables?
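As a generic illustration of what a component-wise conjugate update looks like (using made-up counts, not the exercise's data): if one component has a Beta prior and iid Bernoulli observations, the posterior is again Beta.

```python
def beta_bernoulli_update(alpha, beta, successes, trials):
    """Conjugate update for one component: if pi ~ Beta(alpha, beta)
    a priori and we observe `successes` ones among `trials` iid
    Bern(pi) draws, then
    pi | data ~ Beta(alpha + successes, beta + trials - successes)."""
    return alpha + successes, beta + trials - successes

# Made-up counts for illustration only: a Beta(1, 1) prior on one
# attack rate, then 4 infections observed among 10 people.
a_post, b_post = beta_bernoulli_update(1, 1, successes=4, trials=10)  # -> (5, 7)
```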