Description
Q1 (6pts): A friend informs you that a casino is using loaded dice, such that:
P_loaded(x) = 1/12  if x ∈ {1, 2, 3}
            = 1/4   if x ∈ {4, 5, 6}
            = 0     otherwise
Q1a: What is the entropy of a roll at this casino? Please use log base 2
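A quick numeric check, assuming the loaded-die probabilities above (1/12 for each of faces 1-3, 1/4 for each of faces 4-6) and log base 2:

import math

# Assumed loaded-die probabilities (as given above): 1/12 for faces 1-3, 1/4 for faces 4-6
p_loaded = [1/12, 1/12, 1/12, 1/4, 1/4, 1/4]

# Shannon entropy in bits: H(P) = -sum over x of P(x) * log2(P(x)), skipping zero-probability faces
entropy = -sum(p * math.log2(p) for p in p_loaded if p > 0)
print(entropy)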
Q1b: Imagine your friend is right, but you choose to give the casino the benefit of the doubt and assume fair dice. What's the KL divergence of fair dice (Q) from the true distribution (P)? i.e. calculate D_KL(loaded dice || fair dice)
Please use log base 2
Q1c: Imagine you choose to believe your friend, but it turns out the casino has since switched back to fair dice. What's the KL divergence of the loaded dice (Q) from the true distribution (P)? i.e. calculate D_KL(fair dice || loaded dice)
Please use log base 2
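Both divergences can be checked the same way; this sketch assumes the same loaded-die probabilities as above and a fair die with 1/6 per face, again in log base 2:

import math

p_loaded = [1/12, 1/12, 1/12, 1/4, 1/4, 1/4]  # assumed loaded-die probabilities from above
p_fair = [1/6] * 6                            # fair die

def kl_divergence(p, q):
    # D_KL(P || Q) = sum over x of P(x) * log2(P(x) / Q(x)), skipping faces where P(x) = 0
    return sum(px * math.log2(px / qx) for px, qx in zip(p, q) if px > 0)

print(kl_divergence(p_loaded, p_fair))  # Q1b: D_KL(loaded dice || fair dice)
print(kl_divergence(p_fair, p_loaded))  # Q1c: D_KL(fair dice || loaded dice)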
Q2 (5pts): Given this ANN structure:
And the following parameter/function definitions:
W = [-15, -3, -2, 4, 1, 10] B = [4, 1, -0.5]
f₁(x) = f₂(x) = max(0.1x, x)        f₃(x) = x²
What are the intermediate and/or output values for the following data points?
Q2a: Data point: a = 0.5, c = 0.5
x′₁ value:        x′₂ value:
f₁ value:
Q2b: Data point: a = 1, c = 0
x′₂ value:        f₁ value:
f₂ value:
Q2c: Data point: a = 0, c = 1
x′₁ value:        x′₂ value:
f₃ value:
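A forward pass can be sketched in Python to organize the arithmetic. The wiring below is an assumption (the exact edge order for W and B comes from the ANN diagram, which is not reproduced here): the first four weights feed the two hidden nodes, the last two feed the output node, and the biases follow the same order.

def leaky(x):
    # Hidden activations: f1(x) = f2(x) = max(0.1x, x)
    return max(0.1 * x, x)

def forward(a, c, W, B):
    # Assumed wiring; check it against the ANN diagram before relying on the numbers.
    x1p = W[0] * a + W[1] * c + B[0]    # x'1, pre-activation of hidden node 1
    x2p = W[2] * a + W[3] * c + B[1]    # x'2, pre-activation of hidden node 2
    f1, f2 = leaky(x1p), leaky(x2p)     # hidden activations f1, f2
    x3p = W[4] * f1 + W[5] * f2 + B[2]  # output pre-activation
    f3 = x3p ** 2                       # output: f3(x) = x^2
    return {"x'1": x1p, "x'2": x2p, "f1": f1, "f2": f2, "f3": f3}

W = [-15, -3, -2, 4, 1, 10]
B = [4, 1, -0.5]
for a, c in [(0.5, 0.5), (1, 0), (0, 1)]:
    print((a, c), forward(a, c, W, B))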
Q3 (4pts): Given a test data point:
Height = 200
Weight = 200
And the training dataset in the table below, use kNN classification with k=1, k=3, and k=5 to label the test data point. Break ties by increasing k by 1.
Show your work by filling in the table and writing in the model's class label predictions.
Class   Height   Weight   Manhattan Distance from test sample
1       105      114      __________
1       92       169      __________
1       87       140      __________
2       111      109      __________
2       79       44       __________
2       92       55       __________
3       265      331      __________
3       330      284      __________
3       185      309      __________
Model predictions for:
k = 1 _________ k = 3 _________ k = 5 _________
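The distance column and the votes can be tabulated programmatically; this sketch uses the Manhattan distance and the tie-breaking rule stated above (increase k by 1 and revote):

from collections import Counter

# Training data from the table: (class, height, weight)
train = [
    (1, 105, 114), (1, 92, 169), (1, 87, 140),
    (2, 111, 109), (2, 79, 44), (2, 92, 55),
    (3, 265, 331), (3, 330, 284), (3, 185, 309),
]
test = (200, 200)  # (height, weight) of the test point

def manhattan(height, weight):
    return abs(height - test[0]) + abs(weight - test[1])

# Rank training samples by Manhattan distance to the test point
ranked = sorted(train, key=lambda row: manhattan(row[1], row[2]))

def knn_predict(k):
    # Majority vote among the k nearest; on a tie, grow k by 1 and revote
    while k <= len(ranked):
        votes = Counter(label for label, _, _ in ranked[:k]).most_common()
        if len(votes) == 1 or votes[0][1] > votes[1][1]:
            break
        k += 1
    return votes[0][0]

for row in ranked:
    print(row, "distance =", manhattan(row[1], row[2]))
for k in (1, 3, 5):
    print("k =", k, "prediction:", knn_predict(k))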