ISYE6420 – 1 Emily, Car, Stock Market, Sweepstakes, Vacation and Bayes. Solved

Based on the problem statement, we have the Bayesian network shown in Figure 1 and the corresponding probabilities shown in Table 1.

Figure 1: The DAG of the Bayesian network
Table 1: The known (or elicited) conditional probabilities
Table 2: Market condition
Market (M):   Mc     M
              0.5    0.5

Table 3: Emily's grade
Grade (G):    A      B      ≤ C
              0.6    0.3    0.1

Table 4: Car condition
Car (C):   C = 0   C = 1   M∗   G∗
           0.5     0.5     Mc   GA
           0.7     0.3     Mc   GB
           0.9     0.1     Mc   GC
           0.2     0.8     M    GA
           0.5     0.5     M    GB
           0.8     0.2     M    GC

Table 5: Probability that Emily went to Redington Shores
Redington Shores (R):   R = 0   R = 1   L∗   C∗
                        0.8     0.2     Lc   Cc
                        0.3     0.7     Lc   C
                        0.01    0.99    L    Cc
                        0.01    0.99    L    C
We let T∗ denote either the event T or its complement Tc. By the Markov property of the network, the joint probability P(M∗,G∗,C∗,L∗,R∗) can be factorized as
P(M∗,G∗,C∗,L∗,R∗) = P(M∗)P(G∗)P(C∗|M∗,G∗)P(L∗)P(R∗|C∗,L∗).
We find P(C|R), P(L|R), P(GB|R) and P(Mc|R) for (a), (b), (c) and (d), respectively. We show the exact calculation for (a) as an example; the detailed computations for (b), (c) and (d) are shown in Appendix A.
(a) We have
P(C,R) = ∑_{M∗,G∗,L∗} P(M∗,G∗,C,L∗,R)
= P(M)P(G)P(C|M,G)P(L)P(R|C,L)
+ P(M)P(G)P(C|M,G)P(Lc)P(R|C,Lc)
+ P(M)P(Gc)P(C|M,Gc)P(L)P(R|C,L)
+ P(M)P(Gc)P(C|M,Gc)P(Lc)P(R|C,Lc)
+ P(Mc)P(G)P(C|Mc,G)P(L)P(R|C,L)
+ P(Mc)P(G)P(C|Mc,G)P(Lc)P(R|C,Lc)
+ P(Mc)P(Gc)P(C|Mc,Gc)P(L)P(R|C,L)
+ P(Mc)P(Gc)P(C|Mc,Gc)P(Lc)P(R|C,Lc)
= 0.3677.
By the same argument, we have P(R) = 0.4630. Hence,
P(C|R) = P(C,R)/P(R) = 0.3677/0.4630 = 0.7940.
(b)
P(L|R) = P(L,R)/P(R) = 0.0021.
(c)
P(GB|R) = P(GB,R)/P(R) = 0.2595.
(d)
P(Mc|R) = 1 − P(M,R)/P(R) = 0.4326.
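As a cross-check, the exact enumeration over the network can be reproduced in a few lines of Python. This is a sketch under the assumption that Tables 2–5 are parameterized as read (1 denotes the event, 0 its complement; grades are 'A', 'B', 'C'):

```python
import itertools

# Sketch of the exact enumeration for part 1, assuming the
# parameterization of Tables 2-5 above.
pM = {1: 0.5, 0: 0.5}                    # bullish market (M) vs bearish (Mc)
pG = {'A': 0.6, 'B': 0.3, 'C': 0.1}      # Emily's grade
pL = {1: 0.001, 0: 0.999}                # sweepstakes win (L) vs not (Lc)
pC1 = {(0, 'A'): 0.5, (0, 'B'): 0.3, (0, 'C'): 0.1,   # P(C=1 | Mc, G)
       (1, 'A'): 0.8, (1, 'B'): 0.5, (1, 'C'): 0.2}   # P(C=1 | M, G)
pR1 = {(0, 0): 0.2, (0, 1): 0.7,         # P(R=1 | Lc, C*)
       (1, 0): 0.99, (1, 1): 0.99}       # P(R=1 | L, C*)

# Joint over all configurations, with the hard evidence R = 1 folded in.
joint = {}
for m, g, l, c in itertools.product([0, 1], 'ABC', [0, 1], [0, 1]):
    pc = pC1[(m, g)] if c else 1 - pC1[(m, g)]
    joint[(m, g, l, c)] = pM[m] * pG[g] * pL[l] * pc * pR1[(l, c)]

PR = sum(joint.values())                                       # P(R)
PCgR = sum(v for (m, g, l, c), v in joint.items() if c) / PR    # (a)
PLgR = sum(v for (m, g, l, c), v in joint.items() if l) / PR    # (b)
PGBgR = sum(v for (m, g, l, c), v in joint.items() if g == 'B') / PR  # (c)
PMcgR = sum(v for (m, g, l, c), v in joint.items() if m == 0) / PR    # (d)
print(PR, PCgR, PLgR, PGBgR, PMcgR)
```

The printed values agree with the exact Matlab computation in Appendix A.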
Matlab code for the direct-simulation approach is shown in Appendix B, and the OpenBUGS code is shown in Appendix C.
2 Trials until Fourth Success.
We first find the posterior distribution based on a general beta prior, namely Beta(a,b), and n observations.
Let X denote the number of failures until the r-th success in a sequence of independent Bernoulli(p) trials; then X follows a negative binomial distribution with pmf
P(X = x | p) = C(x + r − 1, x) p^r (1 − p)^x,  x = 0, 1, 2, ….
We substitute r = 4 in the above pmf and obtain P(X = x | p) = C(x + 3, x) p^4 (1 − p)^x. Thus, for n independent observations x_1, …, x_n, the likelihood is
L(p | x_1, …, x_n) ∝ p^{4n} (1 − p)^{∑_i x_i}.
With the prior p ∼ Beta(a, b), we have
π(p) ∝ p^{a−1} (1 − p)^{b−1}.
Then we find the posterior distribution as
π(p | x_1, …, x_n) ∝ p^{4n + a − 1} (1 − p)^{∑_i x_i + b − 1}.
Thus the posterior follows Beta(a + 4n, b + ∑_i x_i). As we have n = 11 and ∑_i x_i = 30, the posterior follows Beta(a + 44, b + 30).
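The conjugate update can be written as a one-line helper. The data vector below is hypothetical (the document only fixes n = 11 and ∑x_i = 30); any 11 counts with that sum give the same posterior:

```python
# Conjugate Beta update for negative binomial data: observing n counts
# x_1,...,x_n (failures before the r-th success) turns Beta(a, b) into
# Beta(a + r*n, b + sum(x)).
def nb_beta_posterior(a, b, r, xs):
    return a + r * len(xs), b + sum(xs)

# Hypothetical data: 11 counts summing to 30, matching the given totals.
xs = [3, 2, 4, 1, 5, 2, 3, 4, 2, 1, 3]
print(nb_beta_posterior(1, 1, 4, xs))      # Beta(45, 31), as in part (a)
```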
(a) With a = b = 1, the posterior is Beta(45, 31). The Bayes estimator of p is the posterior mean,
p̂ = 45/(45 + 31) = 0.5921.
The 95% credible set of p is found as [0.4804, 0.6992] based on the following Matlab commands.
> betainv(0.025, 45, 31)
[1] 0.4804
> betainv(0.975, 45, 31)
[1] 0.6992
The posterior probability of hypothesis H : p ≥ 0.8 is 0.00001996, computed by the following Matlab command.
> 1-betacdf(0.8, 45, 31)
[1] 1.9962e-05
(b) With a = b = 1/2, the posterior is Beta(44.5, 30.5). The Bayes estimator of p is the posterior mean,
p̂ = 44.5/(44.5 + 30.5) = 0.5933.
The 95% credible set of p is found as [0.4809, 0.7011] based on the following Matlab commands.
> betainv(0.025, 44.5, 30.5)
[1] 0.4809
> betainv(0.975, 44.5, 30.5)
[1] 0.7011
The posterior probability of hypothesis H : p ≥ 0.8 is 0.00002479, computed by the following Matlab command.
> 1-betacdf(0.8, 44.5, 30.5)
[1] 2.4792e-05
(c) With a = 9, b = 1, the posterior is Beta(53, 31). The Bayes estimator of p is the posterior mean,
p̂ = 53/(53 + 31) = 0.6310.
The 95% credible set of p is found as [0.5257, 0.7303] based on the following Matlab commands.
> betainv(0.025, 53, 31)
[1] 0.5257
> betainv(0.975, 53, 31)
[1] 0.7303
The posterior probability of hypothesis H : p ≥ 0.8 is 0.0001927, computed by the following Matlab command.
> 1-betacdf(0.8, 53, 31)
[1] 1.9265e-04
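The three posterior summaries can also be checked by Monte Carlo using only the Python standard library; the assumption here is that 200,000 draws from `random.betavariate` pin down the quantiles to roughly two decimal places, which is enough to confirm the `betainv` results:

```python
import random

# Monte Carlo cross-check of the Beta posterior summaries in (a)-(c),
# using only the standard library.
random.seed(1)

def beta_summary(a, b, ndraws=200_000):
    """Posterior mean and equal-tail 95% credible set of Beta(a, b), by sampling."""
    draws = sorted(random.betavariate(a, b) for _ in range(ndraws))
    mean = sum(draws) / ndraws
    ci = (draws[int(0.025 * ndraws)], draws[int(0.975 * ndraws)])
    return mean, ci

for a, b in [(45, 31), (44.5, 30.5), (53, 31)]:
    mean, ci = beta_summary(a, b)
    print(f"Beta({a}, {b}): mean {mean:.4f}, 95% set [{ci[0]:.4f}, {ci[1]:.4f}]")
```

The tail probability P(p ≥ 0.8) is on the order of 1e-5, too small to estimate reliably at this sample size, so the `betacdf` values are better computed exactly.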
3 Penguins.
We first develop a Gibbs sampler that draws from the posterior of µ and τ.
Let Y_i, i = 1, …, n, denote the measurements of the penguins' heights. We assume
Y_1, …, Y_n ∼ N(µ, 1/τ);  µ ∼ N(µ0, 1/τ0);  τ ∼ Ga(k, θ),
where µ0 = 45, τ0 = 1/4, and τ is parameterized by shape parameter k and scale parameter θ, with k = 4, θ = 2.
The joint distribution is
p(y, µ, τ) ∝ τ^{n/2} exp(−(τ/2) ∑_i (y_i − µ)^2) · exp(−(τ0/2)(µ − µ0)^2) · τ^{k−1} exp(−τ/θ).
Thus
π(µ | τ, y) ∝ exp(−((τ0 + nτ)/2) (µ − (τ ∑_i y_i + τ0 µ0)/(τ0 + nτ))^2),
which is a kernel of a normal distribution, so
µ | τ, y ∼ N((τ ∑_i y_i + τ0 µ0)/(τ0 + nτ), 1/(τ0 + nτ)).
By plugging in the values n = 14, µ0 = 45, τ0 = 1/4 and ∑_i y_i = 616, we know that
µ | τ, y ∼ N((616τ + 11.25)/(14τ + 1/4), 1/(14τ + 1/4)).
Similarly, we have
π(τ | µ, y) ∝ τ^{k + n/2 − 1} exp(−τ (1/θ + (1/2) ∑_i (y_i − µ)^2)),
which is a kernel of a gamma distribution whose second parameter is a scale parameter:
τ | µ, y ∼ Ga(k + n/2, (1/θ + (1/2) ∑_i (y_i − µ)^2)^{−1}).
By plugging in the values n = 14, k = 4 and θ = 2, we know that
τ | µ, y ∼ Ga(11, (1/2 + (1/2) ∑_i (y_i − µ)^2)^{−1}),
where y_i, i = 1, …, 14, come from the given data. The Matlab code for parts (a) and (b) is attached in Appendix D.
(a) The approximate posterior probability of the hypothesis H0 : µ < 45 is 0.9805.
(b) The approximate 95% credible set for τ is [0.1717, 0.6023].
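For readers without Matlab, the same Gibbs sampler can be sketched in pure Python (standard library only). The full conditionals are the ones derived above; note that `random.gammavariate` takes shape and scale, matching the scale parameterization used here:

```python
import random

# Gibbs sampler sketch for the penguin model, mirroring Appendix D:
#   mu  | tau, y ~ N((tau*sum(y) + tau0*mu0)/(tau0 + n*tau), 1/(tau0 + n*tau))
#   tau | mu,  y ~ Ga(k + n/2, scale = 1/(1/theta + sum((y_i - mu)^2)/2))
random.seed(42)
y = [41, 44, 43, 47, 43, 46, 45, 42, 45, 45, 43, 45, 47, 40]
n, mu0, tau0, k, theta = len(y), 45.0, 0.25, 4.0, 2.0
mu, tau = 40.0, 8.0                       # starting values, as in Appendix D
mus, taus = [], []
for _ in range(11_000):
    prec = tau0 + n * tau
    mu = random.gauss((tau * sum(y) + tau0 * mu0) / prec, (1.0 / prec) ** 0.5)
    rate = 1.0 / theta + 0.5 * sum((yi - mu) ** 2 for yi in y)
    tau = random.gammavariate(k + n / 2, 1.0 / rate)
    mus.append(mu)
    taus.append(tau)
mus, taus = mus[1000:], taus[1000:]       # discard 1000 burn-in draws
p_H0 = sum(m < 45 for m in mus) / len(mus)
print(sum(mus) / len(mus), p_H0)
```

With this seed the posterior mean of µ and P(µ < 45 | y) come out close to the Matlab values (≈ 44.05 and ≈ 0.98), as expected for an independent run of the same chain.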
A Matlab code for exact calculation of Emily, Car, Stock Market, Sweepstakes, Vacation and Bayes
PR = ...
    0.5 * 0.6 * 0.999 * 0.8 * 0.7 + ...
    0.5 * 0.6 * 0.999 * 0.2 * 0.2 + ...
    0.5 * 0.3 * 0.999 * 0.5 * 0.7 + ...
    0.5 * 0.3 * 0.999 * 0.5 * 0.2 + ...
    0.5 * 0.1 * 0.999 * 0.2 * 0.7 + ...
    0.5 * 0.1 * 0.999 * 0.8 * 0.2 + ...
    0.5 * 0.6 * 0.999 * 0.5 * 0.7 + ...
    0.5 * 0.6 * 0.999 * 0.5 * 0.2 + ...
    0.5 * 0.3 * 0.999 * 0.3 * 0.7 + ...
    0.5 * 0.3 * 0.999 * 0.7 * 0.2 + ...
    0.5 * 0.1 * 0.999 * 0.1 * 0.7 + ...
    0.5 * 0.1 * 0.999 * 0.9 * 0.2 + ...
    0.001 * 0.99

PMR = ...
    0.5 * 0.6 * 0.999 * 0.8 * 0.7 + ...
    0.5 * 0.6 * 0.999 * 0.2 * 0.2 + ...
    0.5 * 0.3 * 0.999 * 0.5 * 0.7 + ...
    0.5 * 0.3 * 0.999 * 0.5 * 0.2 + ...
    0.5 * 0.1 * 0.999 * 0.2 * 0.7 + ...
    0.5 * 0.1 * 0.999 * 0.8 * 0.2 + ...
    0.5 * 0.001 * 0.99

PRGB = ...
    0.5 * 0.3 * 0.999 * 0.5 * 0.7 + ...
    0.5 * 0.3 * 0.999 * 0.5 * 0.2 + ...
    0.5 * 0.3 * 0.999 * 0.3 * 0.7 + ...
    0.5 * 0.3 * 0.999 * 0.7 * 0.2 + ...
    0.3 * 0.001 * 0.99

PRC = ...
    0.5 * 0.6 * 0.999 * 0.8 * 0.7 + ...
    0.5 * 0.3 * 0.999 * 0.5 * 0.7 + ...
    0.5 * 0.1 * 0.999 * 0.2 * 0.7 + ...
    0.5 * 0.6 * 0.999 * 0.5 * 0.7 + ...
    0.5 * 0.3 * 0.999 * 0.3 * 0.7 + ...
    0.5 * 0.1 * 0.999 * 0.1 * 0.7 + ...
    0.5 * 0.6 * 0.001 * 0.8 * 0.99 + ...
    0.5 * 0.3 * 0.001 * 0.5 * 0.99 + ...
    0.5 * 0.1 * 0.001 * 0.2 * 0.99 + ...
    0.5 * 0.6 * 0.001 * 0.5 * 0.99 + ...
    0.5 * 0.3 * 0.001 * 0.3 * 0.99 + ...
    0.5 * 0.1 * 0.001 * 0.1 * 0.99

PLR = 0.001 * 0.99

format long
PR % total probability
PMcgR = 1 - PMR/PR
PGBgR = PRGB/PR
PCgR = PRC/PR
PLgR = PLR/PR
format short

% P(R) = 0.4630275
% P(Mc|R) = 0.432576898780310
% P(G_B|R) = 0.259546139268186
% P(C|R) = 0.794018173866563
% P(L|R) = 0.002138101948588

B Matlab code for simulation of Emily, Car, Stock Market, Sweepstakes, Vacation and Bayes
s = RandStream('mt19937ar', 'Seed', 1);
RandStream.setGlobalStream(s);
%
B = 100000;
lotteries = []; cars = []; markets = []; gradesB = []; % save history
redingtonh = 1; % hard evidence
for i = 1:B
    lottery = rand <= 0.001; % 1 for won, 0 for not won
    market = rand <= 0.5;    % 1 for bullish, 0 for bearish
    grade = mnrnd(1, [0.6, 0.3, 0.1], 1);
    gradeA = grade(1);
    gradeB = grade(2);
    gradeCorless = grade(3);

    if (market)
        if (gradeA), car = rand <= 0.8;
        elseif (gradeB), car = rand <= 0.5;
        else car = rand <= 0.2;
        end
    else
        if (gradeA), car = rand <= 0.5;
        elseif (gradeB), car = rand <= 0.3;
        else car = rand <= 0.1;
        end
    end

    if (lottery)
        redington = rand <= 0.99;
    else
        if (car), redington = rand <= 0.7;
        else redington = rand <= 0.2;
        end
    end

    % hard evidence filter
    if (redington == redingtonh)
        gradesB = [gradesB gradeB];
        cars = [cars car];
        lotteries = [lotteries lottery];
        markets = [markets market];
    end
end
% (a) Got car
mean(cars) % 0.7931
% (b) Won sweepstakes
mean(lotteries) % 0.0022
% (c) Got grade B
mean(gradesB) % 0.2614
% (d) Market bearish
1 - mean(markets) % 0.4307

C OpenBUGS code for Emily, Car, Stock Market, Sweepstakes, Vacation and Bayes
# Model
model {
    market ~ dcat(p.market[])
    grade ~ dcat(p.grade[])
    gradeA <- equals(grade, 1)
    gradeB <- equals(grade, 2)
    gradeCorless <- equals(grade, 3)
    lottery ~ dcat(p.lottery[])
    car ~ dcat(p.car[market, grade, ])
    redington ~ dcat(p.redington[lottery, car, ])
}
# Data
list(redington = 2,
     p.market = c(0.5, 0.5),
     p.grade = c(0.6, 0.3, 0.1),
     p.lottery = c(0.999, 0.001),
     p.car = structure(.Data = c(0.5, 0.5, 0.7, 0.3, 0.9, 0.1,
                                 0.2, 0.8, 0.5, 0.5, 0.8, 0.2),
                       .Dim = c(2, 3, 2)),
     p.redington = structure(.Data = c(0.8, 0.2, 0.3, 0.7,
                                       0.01, 0.99, 0.01, 0.99),
                             .Dim = c(2, 2, 2)))

D Matlab code for Penguins
% full conditional distributions available
%
% y_i ~ N(mu, 1/tau), i = 1,...,n
% mu  ~ N(mu0, 1/tau0); mu0 = 45, tau0 = 1/4
% tau ~ Ga(k, theta); shape k = 4, scale theta = 2 (rate 1/2)
%------------------------------------------
clear all;
close all;
clc;
%----------------- figure defaults
lw = 2;
set(0, 'DefaultAxesFontSize', 17);
fs = 14;
msize = 5;
%------------------------------------------
n = 14; % sample size
randn('state', 10);
x = [41 44 43 47 43 46 45 42 45 45 43 45 47 40];
suma = sum(x);
%------------------------------------------
nn = 10000 + 1000;
mus = [];
taus = [];
mu = 40; tau = 8; % starting values for the chain
mu0 = 45; tau0 = 1/4;
h = waitbar(0, 'Simulation in progress');
for i = 1:nn
    new_mu = normrnd((tau * suma + tau0 * mu0) / (tau0 + n * tau), ...
        sqrt(1 / (tau0 + n * tau)));
    par = 1/2 + 1/2 * sum((x - new_mu).^2); % tau is drawn given the updated mu
    new_tau = gamrnd(4 + n/2, 1/par);
    mus = [mus new_mu];
    taus = [taus new_tau];
    tau = new_tau;
    mu = new_mu;
    if i/50 == fix(i/50) % update wait bar every 50 iterations
        waitbar(i/nn)
    end
end
close(h)
%
burnin = 1000;

figure(1)
subplot(2,1,1)
plot((nn-burnin:nn), mus(nn-burnin:nn))
xlabel('Mu')
subplot(2,1,2)
plot((nn-burnin:nn), taus(nn-burnin:nn))
xlabel('Tau')

figure(2)
subplot(2,1,1)
hist(mus(burnin:nn), 70)
title('Mu');
subplot(2,1,2)
hist(taus(burnin:nn), 70)
title('Tau');

figure(3)
plot(mus(burnin:nn), taus(burnin:nn), '.')
xlabel('Mu');
ylabel('Tau');
title('Scatter plot of new mu and new tau');
mean(mus(burnin:nn)) % 44.0477
mean(taus(burnin:nn)) % 0.3557

mean(1./taus(burnin:nn)) % 3.1095
%
% Posterior summary of mu from the Gibbs sampler,
% after discarding 1000 burn-in records
[mean(mus(burnin:nn)) std(mus(burnin:nn)) prctile(mus(burnin:nn),2.5) ...
    median(mus(burnin:nn)) prctile(mus(burnin:nn),97.5)]
% 44.0493 0.4604 43.1308 44.0504 44.9601

% Posterior summary of the precision tau (= 1/sig2) from the Gibbs sampler,
% after discarding 1000 burn-in records
[mean(taus(burnin:nn)) std(taus(burnin:nn)) ...
    prctile(taus(burnin:nn),2.5) median(taus(burnin:nn)) ...
    prctile(taus(burnin:nn),97.5)]
% 0.3554 0.1095 0.1735 0.3452 0.5995

% (a)
sum(mus(burnin:nn) < 45)/(nn-burnin)
% 0.9805
% (b)
[prctile(taus, 2.5) prctile(taus, 97.5)]
% 0.1717 0.6023
