Tutoring Case: MAST20006

July 26, 2020

Solutions

Student ID: ________

The University of Melbourne
Semester 1 Exam Solutions — June, 2019
School of Mathematics and Statistics

MAST20006 Probability for Statistics

Exam Duration: 3 Hours. Reading Time: 15 Minutes. This paper has 9 pages.

Authorised materials: This is a closed book exam. A University approved hand-held calculator, i.e. Casio FX82 (with any suffix), may be used.

Instructions to Invigilators: Script books shall be supplied to each student. Students may not take this paper with them at the end of the exam.

Instructions to Students: This paper has 10 questions. A formula sheet is given at the end of this paper. Attempt as many questions, or parts of questions, as you can. Questions carry marks as shown in the brackets after the question statements. The total number of marks available for this examination is 100. Working and/or reasoning must be given to obtain full credit.

This paper may be reproduced and lodged at the Baillieu Library.

1. Voters living in the city vote for candidate A with probability 0.6, while voters living in the country vote for candidate A with probability 0.4. It is known that 80% of all voters live in the city and the remaining 20% live in the country.

(a) Compute the probability that a randomly chosen voter will vote for candidate A. [3]
• P(vote A) = 0.8 × 0.6 + 0.2 × 0.4 = 0.56.

(b) Compute the probability that a person who will vote for candidate A lives in the city. [3]
• By Bayes' rule, P(city | vote A) = (0.8 × 0.6)/0.56 = 6/7 ≈ 0.8571.

(c) Four randomly surveyed voters indicated they would vote for candidate A. Let X be the number of voters among these four who live in the city. Compute the probability P(X = 3). [3]
• X ~ Bin(4, 6/7), since by part (b) each of the four A-voters independently lives in the city with probability 6/7.
• Thus P(X = 3) = 4 × (6/7)³ × (1/7) = 864/2401 ≈ 0.35985.

2. Suppose X1 and X2 are two independent random variables that have the following moment generating function (mgf):

M1(t) = E(e^{tX1}) = M2(t) = E(e^{tX2}) = 0.5e^t / (1 − 0.5e^t),  t < ln 2.
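As a quick sanity check (not part of the exam), one can verify numerically that this is the mgf of a Geometric(p = 0.5) random variable counting the number of trials up to and including the first success. A minimal Python sketch, comparing the closed form with a truncated series:

```python
import math

def mgf_closed_form(t):
    # The mgf given in Question 2: 0.5 e^t / (1 - 0.5 e^t), valid for t < ln 2.
    return 0.5 * math.exp(t) / (1 - 0.5 * math.exp(t))

def mgf_geometric_series(t, p=0.5, terms=200):
    # E(e^{tX}) for X ~ Geometric(p), X = trials up to and including the
    # first success: sum over k >= 1 of e^{tk} * p * (1-p)^(k-1).
    return sum(math.exp(t * k) * p * (1 - p) ** (k - 1)
               for k in range(1, terms + 1))

for t in (-1.0, 0.0, 0.5):  # a few points with t < ln 2
    assert abs(mgf_closed_form(t) - mgf_geometric_series(t)) < 1e-9
```

The truncation at 200 terms is safe here because the summand decays geometrically for every t < ln 2.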
(a) Compute the probability P(min(X1, X2) > 2). [3]
• X1 and X2 both follow the Geometric(p = 0.5) distribution (number of trials up to and including the first success), as the given mgf is the mgf of that distribution.
• Since X1 and X2 are i.i.d., P(min(X1, X2) > 2) = P(X1 > 2) P(X2 > 2) = ((1 − 0.5)²)² = 0.0625.

(b) Define Y = X1 + X2. Compute the probability P(Y = 4). [3]
• The mgf of Y is M_Y(t) = (0.5e^t)² / (1 − 0.5e^t)², t < ln 2. Hence Y ~ NB(r = 2, p = 0.5), the negative binomial distribution counting trials up to the second success.
• P(Y = 4) = C(3, 1) × 0.5² × 0.5^{4−2} = 3/16 = 0.1875.

(c) Compute the probability P(Y > 7). [3]
• Define Z ~ Bin(7, 0.5). Then P(Y > 7) = P(Z < 2), since Y > 7 means the first 7 trials contain fewer than 2 successes.
• So P(Y > 7) = P(Z = 0) + P(Z = 1) = 0.5⁷ + 7 × 0.5⁷ = 8/128 = 1/16 = 0.0625.

(d) Compute Var(0.5^Y + 80). [3]
• Since adding a constant does not change variance,
  Var(0.5^Y + 80) = Var(0.5^Y) = E(0.25^Y) − [E(0.5^Y)]² = M_Y(ln 0.25) − [M_Y(ln 0.5)]²
  = (0.5 × 0.25)² / (1 − 0.5 × 0.25)² − [(0.5 × 0.5)² / (1 − 0.5 × 0.5)²]²
  = 1/49 − 1/81 = 32/3969 ≈ 0.00806.

3. Mutation in a certain gene occurs with probability 0.001 in the human population. Let X be the number of people in a random sample of 2500 people observed to have this mutation. Note that X follows a binomial distribution.

(a) Compute P(X ≤ 2). [2]
• P(X ≤ 2) = P(X = 0) + P(X = 1) + P(X = 2)
  = 0.999^2500 + 2500 × 0.001 × 0.999^2499 + C(2500, 2) × 0.001² × 0.999^2498
  = 0.08198 + 0.20516 + 0.25661 = 0.54375.

(b) A binomial distribution b(n, p) can be approximated by a Poisson(λ = np) distribution if p is small and n is large. Use this result to approximate the probability in part (a) by a Poisson probability. [2]
• X is approximately Poisson(λ = 2.5), thus P(X ≤ 2) ≈ e^{−2.5}(1 + 2.5 + 2.5²/2!) = 6.625 e^{−2.5} = 0.54381.

(c) The probability in part (a) may also be approximated by a normal probability based on the central limit theorem. Give a normal approximation (using the continuity correction) to P(X ≤ 2). [2]
• By the CLT, X is approximately N(µ = 2.5, σ² = 2.4975), where σ² = np(1 − p).
• So P(X ≤ 2) ≈ P(Z ≤ (2 + 0.5 − 2.5)/√2.4975) = P(Z ≤ 0) = Φ(0) = 0.5.

4. Let X1 and X2 be two independent Bernoulli(p = 0.5) random variables. Define two new random variables: Y1 = min(X1, X2) and Y2 = max(X1, X2).
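The three answers in Question 3 (exact binomial, Poisson approximation, and continuity-corrected normal approximation) can be reproduced in a few lines; a sketch using only the standard library:

```python
import math

n, p = 2500, 0.001

# Exact binomial P(X <= 2).
exact = sum(math.comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(3))

# Poisson(lambda = np) approximation.
lam = n * p
poisson = math.exp(-lam) * sum(lam**k / math.factorial(k) for k in range(3))

# Continuity-corrected normal approximation; Phi computed via erf.
mu, sigma = n * p, math.sqrt(n * p * (1 - p))
z = (2 + 0.5 - mu) / sigma
normal = 0.5 * (1 + math.erf(z / math.sqrt(2)))

print(round(exact, 5), round(poisson, 5), round(normal, 5))
# -> 0.54375 0.54381 0.5
```

Note how close the Poisson approximation is here (p small, n large), while the normal approximation is poor because λ = 2.5 is far too small for the binomial to look normal.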
(a) Compute the joint probability mass function (pmf) of (Y1, Y2). [3]
• The possible values of (Y1, Y2) are (0, 0), (0, 1) and (1, 1). The pmf f(y1, y2) is:
• f(0, 0) = P(Y1 = Y2 = 0) = P(X1 = X2 = 0) = 0.25;
• f(0, 1) = P(Y1 = 0, Y2 = 1) = P(X1 = 0, X2 = 1) + P(X1 = 1, X2 = 0) = 0.25 + 0.25 = 0.5;
• f(1, 1) = P(Y1 = Y2 = 1) = P(X1 = X2 = 1) = 0.25.

(b) Compute E(Y1), E(Y2), Var(Y1) and Var(Y2). [4]
• E(Y1) = P(Y1 = 1) = P(X1 = X2 = 1) = 0.25.
• E(Y2) = P(Y2 = 1) = 1 − P(Y2 = 0) = 1 − P(X1 = X2 = 0) = 0.75.
• Y1 ~ b(1, 0.25), so Var(Y1) = 0.25 × 0.75 = 3/16 = 0.1875.
• Y2 ~ b(1, 0.75), so Var(Y2) = 0.75 × 0.25 = 3/16 = 0.1875.

(c) Compute Cov(Y1, Y2). Are Y1 and Y2 independent? Why or why not? [3]
• Cov(Y1, Y2) = E(Y1 Y2) − E(Y1) E(Y2) = f(1, 1) − 0.25 × 0.75 = 1/16 = 0.0625.
• Y1 and Y2 are not independent since they are correlated (Cov(Y1, Y2) ≠ 0).

5. Let X be a continuous random variable with probability density function (pdf)

f(x) = c if −4 < x < 0; 2c if 0 < x < 2; 0 elsewhere,

where c is a constant whose value is to be determined.

(a) Find the value of c and the cumulative distribution function (cdf) of X. [2]
• Solving 1 = ∫_{−4}^{0} c dx + ∫_{0}^{2} 2c dx = 4c + 4c gives c = 1/8.
• Hence the cdf of X is

F(x) = 0 if x ≤ −4; (x + 4)/8 if −4 < x < 0; (x + 2)/4 if 0 ≤ x < 2; 1 if x ≥ 2.

(b) Let X1 and X2 be two independent random variables each having the pdf f(x) given above. Define W = min{X1, X2}. Find the 75th percentile of W. [3]
• P(W > w) = P(min{X1, X2} > w) = [P(X > w)]² = [1 − F(w)]², so the cdf of W is

F_W(w) = 1 − [1 − F(w)]² = 0 if w ≤ −4; 1 − [1 − (w + 4)/8]² if −4 < w < 0; 1 − [1 − (w + 2)/4]² if 0 ≤ w < 2; 1 if w ≥ 2.

• Solving 3/4 = F_W(π_{0.75}) gives π_{0.75} = 0, since F_W(0) = 1 − (1/2)² = 3/4.

(c) Consider the transformation Y = X² of X.

i. Is this transformation one-to-one? Find the support of Y. [2]
• The transformation is not one-to-one. The support of Y is 0 ≤ y < 16.

ii. Derive the cdf of Y. [3]
• For 0 ≤ y ≤ 4: G(y) = P(−√y ≤ X ≤ √y) = ∫_{−√y}^{0} (1/8) dx + ∫_{0}^{√y} (2/8) dx = 3√y/8.
• For 4 < y < 16: G(y) = P(−√y ≤ X ≤ 2) = ∫_{−√y}^{0} (1/8) dx + ∫_{0}^{2} (2/8) dx = √y/8 + 1/2.
• So the cdf of Y is

G(y) = 0 if y < 0; 3√y/8 if 0 ≤ y ≤ 4; √y/8 + 1/2 if 4 < y < 16; 1 if y ≥ 16.

iii. Compute the pdf of Y. [2]
• g(y) = G′(y) = 3/(16√y) if 0 < y ≤ 4; 1/(16√y) if 4 < y < 16; 0 elsewhere.

6. Let X1, X2, X3 be independent random variables having Bernoulli(p = 0.5), Bernoulli(p = 0.5) and Poisson(λ = 0.75) distributions, respectively. Define Y1 = X1 + X3 and Y2 = X2 + X3.

(a) Compute the correlation coefficient ρ between Y1 and Y2. [2]
• Var(Y1) = Var(X1) + Var(X3) = 0.25 + 0.75 = 1. Similarly, Var(Y2) = 1.
• Cov(Y1, Y2) = Var(X3) = 0.75, so ρ = Cor(Y1, Y2) = 0.75/1 = 0.75.

(b) Use Chebyshev's inequality P(|X − µ| < kσ) ≥ 1 − 1/k² to find a lower bound for P(|Y1 − 1.25| < √3). [1]
• E(Y1) = E(X1) + E(X3) = 0.5 + 0.75 = 1.25 and Var(Y1) = 1, so σ = 1.
• P(|Y1 − 1.25| < √3) = P(|Y1 − 1.25| < √3 σ) ≥ 1 − 1/3 = 2/3 ≈ 0.667.

(c) Compute the exact value of P(|Y1 − 1.25| < √3). [3]
• Since Y1 takes integer values, |Y1 − 1.25| < √3 ⟺ 0 ≤ Y1 ≤ 2. Hence
• P(|Y1 − 1.25| < √3) = P(0 ≤ Y1 ≤ 2)
  = P(X1 = 0) P(0 ≤ Y1 ≤ 2 | X1 = 0) + P(X1 = 1) P(0 ≤ Y1 ≤ 2 | X1 = 1)
  = P(X1 = 0) P(X3 ∈ {0, 1, 2}) + P(X1 = 1) P(X3 ∈ {0, 1})
  = 0.5 (1 + 0.75 + 0.75²/2) e^{−0.75} + 0.5 (1 + 0.75) e^{−0.75} = 0.8931.

(d) Define Z1 = 1 if Y1 = 0 and Z1 = 0 otherwise; and Z2 = 1 if Y2 = 0 and Z2 = 0 otherwise.

i. Compute P(Z1 = 1) and P(Z2 = 1). [2]
• Both Z1 and Z2 are Bernoulli random variables with P(Z1 = 1) = P(Z2 = 1) = P(X1 = X3 = 0) = P(X2 = X3 = 0) = 0.5 e^{−0.75} ≈ 0.2362.

ii. Compute the joint pmf of (Z1, Z2). [2]
• P(Z1 = 1, Z2 = 1) = P(X1 = X2 = X3 = 0) = 0.25 e^{−0.75} ≈ 0.1181.
• P(Z1 = 1, Z2 = 0) = P(X1 = X3 = 0, X2 > 0) = 0.25 e^{−0.75} ≈ 0.1181.
• P(Z1 = 0, Z2 = 1) = P(X1 > 0, X2 = X3 = 0) = 0.25 e^{−0.75} ≈ 0.1181.
• P(Z1 = 0, Z2 = 0) = 1 − 3 × 0.25 e^{−0.75} ≈ 0.6457.
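Because X1 and X2 take only the values 0 and 1 and the Poisson tail decays quickly, the answers to parts (c) and (d) of Question 6 can be checked by direct enumeration. A sketch, truncating the Poisson support at 60 (the discarded tail mass is negligible):

```python
import math

lam = 0.75
def pois(k):
    # Poisson(0.75) pmf.
    return math.exp(-lam) * lam**k / math.factorial(k)

p_c = 0.0                                  # P(|Y1 - 1.25| < sqrt(3))
joint = {(a, b): 0.0 for a in (0, 1) for b in (0, 1)}  # pmf of (Z1, Z2)
for x1 in (0, 1):
    for x2 in (0, 1):
        for x3 in range(60):
            pr = 0.25 * pois(x3)           # P(X1=x1) P(X2=x2) P(X3=x3)
            y1, y2 = x1 + x3, x2 + x3
            if abs(y1 - 1.25) < math.sqrt(3):
                p_c += pr
            joint[(int(y1 == 0), int(y2 == 0))] += pr

print(round(p_c, 4))   # -> 0.8931
print({k: round(v, 4) for k, v in joint.items()})
```

Summing the full joint weight 0.25 × pois(x3) over all three variables gives the correct marginal for events involving only Y1, since the x2 values simply contribute their total probability of 1.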
7. Suppose X and Y are continuous random variables with the joint pdf

f(x, y) = 3 if 0 ≤ x ≤ 1 and 0 ≤ y ≤ x², and 0 elsewhere.

(a) Find the marginal pdf of X. Is the marginal pdf of X a uniform pdf? [2]
• The marginal pdf of X is f1(x) = ∫_{0}^{x²} 3 dy = 3x², 0 ≤ x ≤ 1.
• f1(x) is not a uniform pdf.

(b) Find the conditional pdf of Y given X = x, 0 ≤ x ≤ 1. Is it a uniform pdf? [2]
• h(y | x) = f(x, y)/f1(x) = 3/(3x²) = 1/x², 0 ≤ y ≤ x², for each 0 < x ≤ 1.
• So h(y | X = x) is a Uniform(0, x²) pdf.

(c) Are X and Y independent? Why or why not? [1]
• X and Y are not independent, because h(y | x) depends on x, or because the support of (X, Y) is not rectangular.

(d) Compute the conditional expectation E[2^Y | X = 2^{−1/2}]. [2]
• Given X = 2^{−1/2}, Y is Uniform(0, (2^{−1/2})²) = Uniform(0, 1/2), so
  E[2^Y | X = 2^{−1/2}] = ∫_{0}^{1/2} 2^y / (2^{−1/2})² dy = 2 ∫_{0}^{1/2} 2^y dy = (2√2 − 2)/ln 2 ≈ 1.1952.
• Alternatively, use the mgf of Uniform(0, 2^{−1}) to get the same answer.

(e) Compute the probability P(Y ≥ X³). [3]
• P(Y ≥ X³) = ∫_{0}^{1} ∫_{x³}^{x²} 3 dy dx = ∫_{0}^{1} 3(x² − x³) dx = [x³ − (3/4)x⁴]_{0}^{1} = 1/4.

8. Consider two random variables X1 and X2 with the joint probability density

f(x1, x2) = 2 if 0 ≤ x1 ≤ x2 ≤ 1, and 0 elsewhere.

Let Y1 = X1 X2 and Y2 = X2 be a joint transformation of (X1, X2).

(a) Find the support of (Y1, Y2) and sketch it. [3]
• The support of (Y1, Y2) is {0 ≤ y1 ≤ y2² ≤ 1}. (The sketch shows the triangle 0 ≤ x1 ≤ x2 ≤ 1 in the (x1, x2)-plane mapping to the region below the curve y1 = y2² in the (y1, y2)-plane.)

(b) Find the inverse transformation. [1]
• X1 = Y1/Y2, X2 = Y2.

(c) Compute the Jacobian of the inverse transformation. [2]
• J = det [ 1/y2, −y1/y2² ; 0, 1 ] = 1/y2.

(d) Compute the joint pdf of (Y1, Y2). [2]
• g(y1, y2) = |J| f(y1/y2, y2) = 2/y2, 0 ≤ y1 ≤ y2² ≤ 1.

(e) Find the marginal pdf of Y1 from the joint pdf of (Y1, Y2). [2]
• g1(y1) = ∫_{√y1}^{1} (2/y2) dy2 = −2 ln √y1 = −ln y1, 0 < y1 < 1.

9.
Let X1, X2, …, Xn be independent random variables each having the moment-generating function (mgf)

M(t) = (8 − 3t) / [(2 − t)(4 − t)],  t < 2.

(a) Compute the mgf M_{Yn}(t) of the sum Yn = X1 + X2 + … + Xn. [2]
• M_{Yn}(t) = E(e^{t Yn}) = [M(t)]^n = (8 − 3t)^n / [(2 − t)^n (4 − t)^n],  t < 2.

(b) Compute the mgf M_{Ȳn}(t) of the sample mean Ȳn = Yn/n. [2]
• M_{Ȳn}(t) = M_{Yn}(t/n) = (8 − 3t/n)^n / [(2 − t/n)^n (4 − t/n)^n] = (1 − 3t/(8n))^n / [(1 − t/(2n))^n (1 − t/(4n))^n],  t < 2n.

(c) Compute the limiting mgf lim_{n→∞} M_{Ȳn}(t). What distribution does the limiting mgf correspond to? What is the implication of this result? [2]
• lim_{n→∞} M_{Ȳn}(t) = e^{−3t/8} / (e^{−t/2} e^{−t/4}) = e^{3t/8}.
• The limit is the mgf of the degenerate distribution having probability 1 at 3/8.
• This implies that Ȳn →p 3/8 = E(X1) as n → ∞.
• An answer based on applying the WLLN is also correct, provided the WLLN is correctly stated and E(X1) = 3/8 is proved.

(d) Let Zn = √n (Ȳn − 3/8). Compute M_{Zn}(t), the mgf of Zn. Then use this result to compute lim_{n→∞} M_{Zn}(t). Finally, explain what the limiting distribution of Zn is as n → ∞. [5]
• M_{Zn}(t) = E(e^{t √n (Ȳn − 3/8)}) = e^{−3t√n/8} M_{Ȳn}(√n t)
  = e^{−3t√n/8} (1 − 3t/(8√n))^n / [(1 − t/(2√n))^n (1 − t/(4√n))^n]
  = [ e^{−3t/(8√n)} (1 − 3t/(8√n)) / (1 − 3t/(4√n) + t²/(8n)) ]^n,  t < 2√n.
• By Taylor series expansion, e^u ≈ 1 + u + u²/2 when |u| is small. Using this result, for any given t,
  lim_{n→∞} M_{Zn}(t) = lim_{n→∞} [ (1 − 3t/(8√n) + 9t²/(128n)) (1 − 3t/(8√n)) / (1 − 3t/(4√n) + t²/(8n)) ]^n
  = lim_{n→∞} [ (1 − 3t/(4√n) + 27t²/(128n)) / (1 − 3t/(4√n) + t²/(8n)) ]^n
  = lim_{n→∞} (1 + 11t²/(128n))^n = e^{11t²/128}.
• The limiting mgf is that of N(µ = 0, σ² = 11/64), since e^{11t²/128} = e^{σ²t²/2} with σ² = 11/64.
• This implies that Zn →d N(0, 11/64) as n → ∞.
• An answer based on applying the CLT is also correct, provided the CLT is correctly stated and Var(X1) = 11/64 is proved.

10.
A random variable X has the following mgf:

M(t) = e^{−t}(1 − 2t) / [(1 − t)(1 − 4t)],  t < 1/4.

(a) Compute the value of Var(X). [5]
• By partial fractions, it can be shown that M(t) = e^{−t} [ (1/3) × 1/(1 − t) + (2/3) × 1/(1 − 4t) ].
• Thus X + 1 follows an Exp(1) distribution with probability 1/3 and an Exp(4) distribution with probability 2/3, a mixture of exponentials (here Exp(θ) denotes the exponential distribution with mean θ).
• Hence E((X + 1)²) = (1/3) × 2 × 1² + (2/3) × 2 × 4² = 22, and E(X + 1) = (1/3) × 1 + (2/3) × 4 = 3.
• Therefore, Var(X) = Var(X + 1) = 22 − 3² = 13.

(b) Compute the probability P(X² > 4). [5]
• P(X² > 4) = P(X > 2 or X < −2) = P(X + 1 > 3), since X + 1 > 0 rules out X < −2.
• Because X + 1 is a mixture-exponential random variable by part (a),
  P(X² > 4) = (1/3) ∫_{3}^{∞} e^{−x} dx + (2/3) ∫_{3}^{∞} (1/4) e^{−x/4} dx = (1/3) e^{−3} + (2/3) e^{−3/4} ≈ 0.3315.

Total marks = 100

End of the exam questions. Formulas are on the next page.
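The mixture-exponential calculations in Question 10 can be double-checked numerically. A sketch, using the same convention that Exp(θ) has mean θ:

```python
import math

w1, w2 = 1/3, 2/3      # mixture weights from the partial-fraction split
m1, m2 = 1.0, 4.0      # exponential means

mean = w1 * m1 + w2 * m2                    # E(X + 1) = 3
second = w1 * 2 * m1**2 + w2 * 2 * m2**2    # E((X+1)^2), using E = 2*theta^2
var_x = second - mean**2                    # Var(X) = Var(X + 1)

# P(X^2 > 4) = P(X + 1 > 3); exponential tail is exp(-3/mean).
tail = w1 * math.exp(-3 / m1) + w2 * math.exp(-3 / m2)

print(round(var_x, 6), round(tail, 4))   # -> 13.0 0.3315
```

This mirrors the solution exactly: the variance comes from the mixture's first two moments, and the tail probability is the weighted average of the two exponential tails.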
