(a) By observing n samples (Xi, Yi), i = 1,…,n, from a bivariate normal distribution with zero means, unit variances, and correlation ρ, the likelihood can be written as

f(x, y | ρ) ∝ (1 − ρ²)^(−n/2) exp{ −(Σi xi² − 2ρ Σi xiyi + Σi yi²) / (2(1 − ρ²)) }.

By choosing the prior π(ρ) ∝ (1 − ρ²)^(−3/2), −1 ≤ ρ ≤ 1 (the prior consistent with the exponent −(n + 3)/2 used in the code in Appendix A), we have the posterior

π(ρ | x, y) ∝ (1 − ρ²)^(−(n+3)/2) exp{ −(Σi xi² − 2ρ Σi xiyi + Σi yi²) / (2(1 − ρ²)) } · 1(−1 ≤ ρ ≤ 1).
(b) Given the proposal distribution specified in the problem, the algorithm for generating samples from the posterior is shown below, followed by a short MATLAB sketch of a single update. In the algorithm, θ denotes the target parameter sampled from the posterior, namely ρ, and θ′ denotes the proposed value. To avoid confusion with the parameter ρ itself, we write the acceptance probability as γ(θn, θ′) (the same quantity that Brani denotes ρ(θn, θ′) in the lecture video).
• Step 1. Start with an arbitrary initial value θ0.
• Step 2. At stage n + 1, generate a proposal θ′ from the uniform distribution U(θn − 0.1, θn + 0.1).
• Step 3. Determine θn+1 as
– θn+1 = θ′ with probability γ(θn, θ′);
– θn+1 = θn with probability 1 − γ(θn, θ′).
• Step 4. Set n = n + 1 and go to Step 2.
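For concreteness, one update of this random-walk scheme can be sketched in MATLAB as follows. This is only a minimal illustration on the log scale (a numerically safer variant of what the full script in Appendix A does directly); theta, n, and the sums sum_x2, sum_xy, sum_y2 are assumed to already be in the workspace as in Appendix A.

% log of the unnormalized posterior of rho from part (a)
logpost = @(t) -(n+3)/2*log(1-t^2) - (sum_x2 - 2*t*sum_xy + sum_y2)/(2*(1-t^2));
theta_prop = theta - 0.1 + 0.2*rand();   % Step 2: proposal from U(theta-0.1, theta+0.1)
if abs(theta_prop) < 1 && log(rand()) < logpost(theta_prop) - logpost(theta)
    theta = theta_prop;                  % Step 3: accept with probability gamma(theta_n, theta')
end                                      % otherwise the chain stays at theta

Working with log-posterior differences avoids the very small exponential factors that appear when the ratio is formed directly.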
We now find the acceptance probability γ(θn, θ′) as follows. By the Metropolis-Hastings rule,

γ(θn, θ′) = min{ 1, [π(θ′ | x, y) q(θn | θ′)] / [π(θn | x, y) q(θ′ | θn)] },

where q(· | ·) denotes the proposal density. Based on the proposal distribution for θ, we know that

q(θ′ | θn) = (1/0.2) · 1(θn − 0.1 ≤ θ′ ≤ θn + 0.1).

Similarly, we also have

q(θn | θ′) = (1/0.2) · 1(θ′ − 0.1 ≤ θn ≤ θ′ + 0.1).

Notice that the conditions θn − 0.1 ≤ θ′ ≤ θn + 0.1 and θ′ − 0.1 ≤ θn ≤ θ′ + 0.1 are indeed equivalent, and thus

q(θ′ | θn) = q(θn | θ′),

i.e., the proposal is symmetric and the proposal densities cancel. Thus, the acceptance probability reduces to the ratio of unnormalized posteriors,

γ(θn, θ′) = min{ 1, π(θ′ | x, y) / π(θn | x, y) }
= min{ 1, [(1 − θ′²)^(−(n+3)/2) exp{−(Σi xi² − 2θ′ Σi xiyi + Σi yi²)/(2(1 − θ′²))} 1(|θ′| ≤ 1)] / [(1 − θn²)^(−(n+3)/2) exp{−(Σi xi² − 2θn Σi xiyi + Σi yi²)/(2(1 − θn²))} 1(|θn| ≤ 1)] }.
By plugging in the observed values Σi xi² = 114.9707, Σi yi² = 105.9196, and Σi xiyi = 82.5247 based on n = 100 observations, the acceptance probability can be computed as

γ(θn, θ′) = min{ 1, [(1 − θ′²)^(−51.5) exp{−(220.8903 − 165.0494 θ′)/(2(1 − θ′²))} 1(|θ′| ≤ 1)] / [(1 − θn²)^(−51.5) exp{−(220.8903 − 165.0494 θn)/(2(1 − θn²))} 1(|θn| ≤ 1)] }.
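As a quick numerical illustration of this formula (the pair of values below is arbitrary and chosen only for the example; the sufficient statistics are the ones reported above and hard-coded in Appendix A):

% evaluate the acceptance probability gamma(theta_n, theta') for one illustrative pair
n = 100; sum_x2 = 114.9707; sum_y2 = 105.9196; sum_xy = 82.5247;
post = @(t) (1-t^2)^(-(n+3)/2) * ...
    exp(-(sum_x2 - 2*t*sum_xy + sum_y2)/(2*(1-t^2))) * (abs(t) < 1);
theta_n = 0.72; theta_prime = 0.70;   % illustrative current and proposed values
gamma_accept = min(1, post(theta_prime)/post(theta_n))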
(c) The histogram of the sampled ρs and the realizations of the chain for the last 1000 simulations are shown in Figures 1 and 2, respectively. The Bayes estimator is computed as the mean of the ρs obtained after burn-in. We have ρ̂Bayes = 0.7224.
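In symbols, if ρ(1), …, ρ(M) denote the M retained (post-burn-in) draws of the chain, the estimator used here is the posterior mean approximated by the ergodic average

ρ̂Bayes = E[ρ | data] ≈ (1/M) Σ ρ(m), the average taken over m = 1,…,M.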
(d) After replacing the proposal distribution with U(−1,1), we plot the histogram of ρs and the realizations of the chain for the last 1000 simulations in Figures 3 and 4, respectively. The Bayes estimator of ρ based on the proposal distribution U(−1,1) is computed as
ρ̂Bayes = 0.7227.
Figure 1: Histogram of ρs given proposal distribution U(ρi−1 − 0.1,ρi−1 + 0.1)
We notice that the Bayes estimator with proposal distribution U(−1,1) is very close to the one obtained with U(ρi−1 − 0.1, ρi−1 + 0.1). However, the acceptance rate with proposal distribution U(−1,1) is lower, which can be seen from the longer flat (horizontal) segments in the trace of the chain: the chain stays at the same value for many consecutive iterations.
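The acceptance rate can also be read off the stored chain: a rejected proposal leaves the chain at its current value, so the fraction of iterations in which the chain moved estimates the acceptance rate. A minimal MATLAB sketch, assuming thetas holds the draws produced by the script in Appendix A:

% empirical acceptance rate = fraction of iterations in which the chain moved
% (for a continuous proposal, an accepted draw equals the current value with probability 0)
acc_rate = mean(diff(thetas) ~= 0)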
2 Gibbs Sampler and Lifetimes with Multiplicative Frailty.
(a) The lifetimes are modeled as ti | λ, µ ∼ E(λµ), i = 1,…,n, independently, with independent gamma priors λ ∼ Ga(c, α) and µ ∼ Ga(d, β) (rate parameterization, matching the full conditionals used in the R code in Appendix B). The joint posterior then satisfies

π(λ, µ | t1,…,tn) ∝ (λµ)^n exp(−λµ Σ ti) · λ^(c−1) exp(−αλ) · µ^(d−1) exp(−βµ).

Hence λ | µ, t1,…,tn follows a Gamma distribution with parameters n + c and µ Σ ti + α. By symmetry, µ | λ, t1,…,tn follows a Gamma distribution with parameters n + d and λ Σ ti + β.
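Spelling out the conditioning step: viewing the joint posterior as a function of one parameter at a time and dropping all factors that do not involve it, we recognize gamma kernels:

π(λ | µ, t1,…,tn) ∝ λ^(n+c−1) exp{−λ(µ Σ ti + α)}, i.e., λ | µ, t1,…,tn ∼ Ga(n + c, µ Σ ti + α),
π(µ | λ, t1,…,tn) ∝ µ^(n+d−1) exp{−µ(λ Σ ti + β)}, i.e., µ | λ, t1,…,tn ∼ Ga(n + d, λ Σ ti + β).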
(b) The Gibbs sampler algorithm for sampling from (λ, µ) | t1,…,tn is as follows (the full R implementation is given in Appendix B):
• Step 1. Start with µ0 = 0.1.
Figure 2: Realizations of the chain for the last 1000 simulations given proposal distribution U(ρi−1 − 0.1, ρi−1 + 0.1)
• Step 2. For m = 1,…,51000:
– draw λm from Ga(n + c, µm−1 Σ ti + α);
– draw µm from Ga(n + d, λm Σ ti + β).
• Step 3. Discard the first 1000 pairs (λm, µm), m = 1,…,1000, as burn-in.
(c) The scatterplot of (λ, µ) is shown in Figure 5, and the histograms of the individual components λ and µ are shown in Figure 6. The posterior summaries are listed below (a short note on the credible-set computation follows the list).
• Posterior means: 0.0547 for λ and 0.6911 for µ.
• Posterior variances: 0.000358 for λ and 0.06579 for µ.
• 95% equal-tailed credible sets: (0.0255, 0.0983) for λ and (0.3185, 1.3139) for µ.
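For reference, the 95% equal-tailed credible set leaves 2.5% of the posterior mass in each tail, i.e.

C = [q0.025, q0.975],

where qp denotes the p-th posterior quantile. With the 50,000 retained draws, these quantiles are approximated by the 1250th and 48750th order statistics of the sorted sample (0.025 · 50,000 = 1,250 and 0.975 · 50,000 = 48,750), which is exactly what the sorted-sample indexing in the R code in Appendix B computes.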
Figure 3: Histogram of ρs given proposal distribution U(−1,1)
Figure 4: Realizations of the chain for the last 1000 simulations given proposal distribution U(−1,1)
Figure 5: Scatter plot of (λ,µ)
Figure 6: Histograms of individual components λ and µ
A MATLAB code for problem 1
close all force
clear all

theta = 0;                    % initial value of the chain
thetas = [theta];             % storage for the draws

% sufficient statistics of the n = 100 observations
sum_x2 = 114.9707;
sum_y2 = 105.9196;
sum_xy = 82.5247;

n = 100;

tic
for i = 1:51000
    % proposal for parts (b)-(c): random walk U(theta-0.1, theta+0.1)
    %theta_prop = theta - 0.1 + 2*0.1*rand();
    % proposal for part (d): independent U(-1,1)
    theta_prop = -1 + 2*rand();
    %----------------------------------------------------------------------
    if (theta >= -1 && theta <= 1)
        indicator_theta = 1;
    else
        indicator_theta = 0;
    end
    if (theta_prop >= -1 && theta_prop <= 1)
        indicator_theta_prop = 1;
    else
        indicator_theta_prop = 0;
    end
    %----------------------------------------------------------------------
    % ratio of unnormalized posteriors at the proposed and current values
    rr = ((1-theta_prop^2)^(-(n+3)/2)* ...
        exp(-(sum_x2-2*theta_prop*sum_xy+sum_y2)/(2*(1-theta_prop^2))) * ...
        indicator_theta_prop)/((1-theta^2)^(-(n+3)/2)* ...
        exp(-(sum_x2-2*theta*sum_xy+sum_y2)/(2*(1-theta^2)))*indicator_theta);
    %----------------------------------------------------------------------
    rho = min(rr, 1);          % acceptance probability
    if (rand < rho)
        theta = theta_prop;    % accept the proposal
    end
    thetas = [thetas theta];
end
toc
% burn-in: discard the first 1000 draws
thetas = thetas(1001:end);
figure(1)
hist(thetas, 50)
xlabel('theta')
mean(thetas)                   % Bayes estimator (posterior mean) of rho
B R code for problem 2
alpha=100; beta=5              # prior parameters for lambda ~ Ga(c, alpha)
c=3; d=1                       # and mu ~ Ga(d, beta)
sumt=512                       # sum of the observed lifetimes
n=20                           # number of observations
M=51000                        # total number of Gibbs iterations

mu=0.1                         # initial value
set.seed(1)
record = matrix(0, nrow=2, ncol=M)
for(m in 1:M){
  lambda=rgamma(1, shape=(n+c), rate=(mu*sumt+alpha))   # lambda | mu, t
  mu=rgamma(1, shape=(n+d), rate=(lambda*sumt+beta))    # mu | lambda, t
  record[,m] = c(lambda, mu)
}
lambda=record[1,-(1:1000)]     # discard the first 1000 draws as burn-in
mu=record[2,-(1:1000)]
## scatterplot
par(mfrow=c(1, 1))
plot(lambda, mu, main="scatter plot", xlab="lambda", ylab="mu", cex=0.3)

## histograms of lambda and mu
par(mfrow=c(1, 2))
hist(lambda, xlab="lambda", main="histogram of lambda", breaks=50)
hist(mu, xlab="mu", main="histogram of mu", breaks=50)

## posterior means of lambda and mu
mean(lambda)
mean(mu)

## posterior variances of lambda and mu
var(lambda)
var(mu)

## 95% equal-tailed credible sets of lambda and mu
slambda=sort(lambda)
smu=sort(mu)
slambda[c(1250, 48750)]
smu[c(1250, 48750)]

## mean of lambda*mu
mean(lambda*mu)