Student Number

Semester 1 Assessment, 2019
School of Mathematics and Statistics
MAST20004 Probability
Writing time: 3 hours. Reading time: 15 minutes.
This is NOT an open book exam.
This paper consists of 16 pages (including this page).

Authorised Materials
• Mobile phones, smart watches, and internet or communication devices are forbidden.
• Students may bring one double-sided A4 sheet of handwritten notes into the exam room.
• Approved hand-held electronic scientific (but not graphing) calculators may be used.

Instructions to Students
• You must NOT remove this question paper at the conclusion of the exam.
• This paper has 10 questions. Attempt as many questions, or parts of questions, as you can. Marks for individual questions are shown.
• Working and/or reasoning must be given to obtain full credit. Clarity, neatness, and style count.
• Statistical tables are not provided, but you may use the MATLAB output at the end of the examination paper for any question.
• The total number of marks available for this exam is 110.

Instructions to Invigilators
• Students must NOT remove this question paper at the conclusion of the exam.
• Initially students are to receive a 14 page script booklet.

This paper may be held in the Baillieu Library.

1. Consider a random experiment with sample space Ω.
(a) Write down the axioms which must be satisfied by a probability mapping P defined on the events of the experiment.
(b) Using the axioms, show that for any event A ⊂ Ω, P(Aᶜ) = 1 − P(A).
(c) Using the axioms, show that for any events A, B ⊂ Ω such that A ⊂ B, P(B\A) = P(B) − P(A). (Recall that B\A is the event that B and not A will occur, that is, B ∩ Aᶜ.)
(d) Let C, D ⊂ Ω be events. Using (c) and the axioms, show that the probability that exactly one of these events occurs is P(C) + P(D) − 2P(C ∩ D).
[10 marks]

Solution
(a) The axioms are:
A1. For all events A, P(A) ≥ 0.
A2. P(Ω) = 1.
A3. For pairwise disjoint events A1, A2, …,
$$P\left(\bigcup_{i=1}^{\infty} A_i\right) = \sum_{i=1}^{\infty} P(A_i).$$
(b) We first note that A ∪ Aᶜ = Ω and A ∩ Aᶜ = ∅. By A2 and A3,
$$1 = P(\Omega) = P(A \cup A^c) = P(A) + P(A^c),$$
so P(Aᶜ) = 1 − P(A).
(c) Since A ⊂ B, we have B = A ∪ (B ∩ Aᶜ) with A ∩ (B ∩ Aᶜ) = ∅. Therefore, by A3,
$$P(B) = P(A \cup (B \cap A^c)) = P(A) + P(B \cap A^c).$$
Therefore P(B\A) = P(B) − P(A).
(d) The events C\(C ∩ D) and D\(C ∩ D) are mutually exclusive, and the event that exactly one of C, D occurs is (C\(C ∩ D)) ∪ (D\(C ∩ D)). Also C ∩ D ⊂ C and C ∩ D ⊂ D, so by A3 and (c),
$$P\big((C\setminus(C\cap D)) \cup (D\setminus(C\cap D))\big) = P(C\setminus(C\cap D)) + P(D\setminus(C\cap D)) = P(C) - P(C\cap D) + P(D) - P(C\cap D) = P(C) + P(D) - 2P(C\cap D).$$

2. A box contains ten coins: three coins of type one, four of type two, and three of type three. Type one coins are fair, type two coins are weighted so that, when tossed, they show a head 70% of the time, and type three coins show a tail 60% of the time. A coin is selected randomly from the box and then tossed.
(a) What is the probability that a tail is showing?
(b) If a tail is showing, what is the probability that the coin is of type three?
(c) Are the events that a tail is showing and that a type three coin was selected positively related, negatively related, or independent? Justify your answer.
(d) Given that a tail is showing, and you toss the same coin again, what is the probability that a head will be showing?
[10 marks]

Solution
For i = 1, 2, 3, let Ci be the event that a coin of type i was selected, and let T be the event that a tail is showing.
(a) By the law of total probability,
$$P(T) = P(T\mid C_1)P(C_1) + P(T\mid C_2)P(C_2) + P(T\mid C_3)P(C_3) = 0.5\times 0.3 + 0.3\times 0.4 + 0.6\times 0.3 = 0.45.$$
(b) By Bayes' theorem,
$$P(C_3\mid T) = \frac{P(T\mid C_3)P(C_3)}{P(T)} = \frac{0.6\times 0.3}{0.45} = 0.4.$$
(c) Since P(C3|T) = 0.4 > 0.3 = P(C3) (equivalently, P(T|C3) > P(T)), the events C3 and T are positively related.
(d) Let B be the event that the second toss is a head.
Using the law of total probability we get
$$P(B\mid T) = \frac{P(B \cap T)}{P(T)} = \frac{P(B\cap T\mid C_1)P(C_1) + P(B\cap T\mid C_2)P(C_2) + P(B\cap T\mid C_3)P(C_3)}{P(T)} = \frac{0.5\times 0.5\times 0.3 + 0.7\times 0.3\times 0.4 + 0.4\times 0.6\times 0.3}{0.45} \approx 0.5133.$$
(Given the coin type, successive tosses are independent, so P(B ∩ T | Ci) = P(head | Ci) P(tail | Ci).)

3. Let the probability mass function of X be given by
$$p_X(k) = c\,\frac{2^k}{k!}, \qquad k = 1, 2, \ldots,$$
for some constant c.
(a) What is the value of the constant c?
(b) Which is (are) the most probable value(s) of X? Justify your answer.
(c) Derive an expression without a summation for the probability generating function P_X of X. For which values z ∈ ℝ is P_X defined? Justify your answer.
(d) Using (c), or otherwise, calculate E[X] and V(X).
(e) Is X more likely to be even or odd? Justify your answer.
[13 marks]

Solution
(a) We have that
$$\sum_{k=1}^{\infty} c\,\frac{2^k}{k!} = 1 \implies c(e^2 - 1) = 1 \implies c = (e^2-1)^{-1}.$$
(b) For k = 1, 2, …, the ratio of successive probabilities is
$$r(k) = \frac{p_X(k+1)}{p_X(k)} = \frac{2}{k+1}.$$
Since r(1) = 1, and r(k) < 1 for k = 2, 3, …, the most probable values of X are 1 and 2.
(c) We have that
$$P_X(z) = E[z^X] = (e^2-1)^{-1}\sum_{k=1}^{\infty} \frac{(2z)^k}{k!} = \frac{e^{2z}-1}{e^2-1}.$$
P_X(z) is defined for all z ∈ ℝ since the exponential series converges for all z ∈ ℝ.
(d) Since
$$P_X'(z) = \frac{2e^{2z}}{e^2-1} \quad\text{and}\quad P_X''(z) = \frac{4e^{2z}}{e^2-1},$$
we have E[X] = P_X'(1) = 2e²/(e² − 1), and
$$V(X) = P_X''(1) + P_X'(1) - \big(P_X'(1)\big)^2 = \frac{4e^2}{e^2-1} + \frac{2e^2}{e^2-1} - \frac{4e^4}{(e^2-1)^2} = \frac{2e^2(e^2-3)}{(e^2-1)^2}.$$
Alternative solution:
$$E[X] = (e^2-1)^{-1}\sum_{k=1}^{\infty} k\,\frac{2^k}{k!} = 2(e^2-1)^{-1}\sum_{k=1}^{\infty} \frac{2^{k-1}}{(k-1)!} = \frac{2e^2}{e^2-1}.$$
Also
$$E[X(X-1)] = (e^2-1)^{-1}\sum_{k=2}^{\infty} k(k-1)\frac{2^k}{k!} = 4(e^2-1)^{-1}\sum_{k=2}^{\infty} \frac{2^{k-2}}{(k-2)!} = \frac{4e^2}{e^2-1},$$
and
$$V(X) = E[X(X-1)] + E[X] - \{E[X]\}^2 = \frac{2e^2(e^2-3)}{(e^2-1)^2}.$$
(e) Using P_X(z),
$$P(X \text{ is even}) = \frac{1}{2}\big(P_X(1) + P_X(-1)\big) = \frac{e^2 - 2 + e^{-2}}{2(e^2-1)} = \frac{(e^2-1)(1-e^{-2})}{2(e^2-1)} = \frac{1-e^{-2}}{2}.$$
Now
$$P(X \text{ is odd}) = 1 - P(X \text{ is even}) = \frac{1+e^{-2}}{2} > P(X \text{ is even}),$$
so X is more likely to be odd than even.
Alternative solution. We have that
$$P(X \text{ is even}) = (e^2-1)^{-1}\sum_{j=1}^{\infty} \frac{2^{2j}}{(2j)!} = (e^2-1)^{-1}\sum_{i=1}^{\infty} \frac{2^i + (-2)^i}{2\,i!} = \frac{(e^2-1)^{-1}}{2}\left(e^2 - 2 + e^{-2}\right) = \frac{(e^2-1)^{-1}}{2}(e^2-1)(1-e^{-2}) = \frac{1-e^{-2}}{2}.$$
Now
$$P(X \text{ is odd}) = 1 - P(X \text{ is even}) = \frac{1+e^{-2}}{2} > P(X \text{ is even}),$$
so X is more likely to be odd than even.

4. A factory has ten machines that are all operational at the beginning of an eight-hour shift. Each machine relies on a critical component and carries a backup spare for that component. The lifetimes of the components and their backup spares are independent and distributed according to an exponential distribution with a mean of four hours. Once a component fails, its backup spare immediately replaces it, but if the backup spare also fails, the machine is inoperative. Calculate the probability that at the end of the shift, at least two machines are operational.
[8 marks]

Solution
Let X be the lifetime of a machine. As the sum of two independent exponential lifetimes with rate 1/4, X =d γ(2, 1/4). The probability that a machine is working at the end of a shift is
$$P(X \ge 8) = \int_8^{\infty} \frac{(1/4)^2\, x e^{-x/4}}{\Gamma(2)}\,dx = \frac{1}{1!}\times\frac{1}{16}\left[-4xe^{-x/4} - 16e^{-x/4}\right]_8^{\infty} = 3e^{-2} \approx 0.406.$$
Let Y be the number of machines that are operational at the end of the shift. Then Y =d Bi(10, 0.406), and the probability that at least two machines are operational is
$$P(Y \ge 2) = 1 - P(Y=0) - P(Y=1) = 1 - 0.594^{10} - 10 \times 0.406 \times 0.594^{9} \approx 0.9572.$$

5. Let X =d exp(1), Y = min(X, 2), and Z = e^{X/3}.
(a) Find the cumulative distribution function F_Y of Y.
(b) Is Y a discrete random variable, or a continuous random variable, or neither? Justify your answer.
(c) Calculate E[Y].
(d) Find the cumulative distribution function F_Z, and the probability density function f_Z, of Z. Identify the distribution of Z.
[10 marks]

Solution
(a) We have that
$$F_Y(y) = \begin{cases} 0, & y < 0,\\ 1 - e^{-y}, & 0 \le y < 2,\\ 1, & y \ge 2.\end{cases}$$
(b) Since there is a jump in F_Y at y = 2, Y is not continuous.
On the other hand, Y takes uncountably many possible values and has a positive density on [0, 2), so it is not discrete. Hence Y is neither discrete nor continuous.
(c) We have that
$$E[Y] = \int_0^2 y e^{-y}\,dy + 2\int_2^{\infty} e^{-y}\,dy = \left[-ye^{-y} - e^{-y}\right]_0^2 + 2\left[-e^{-y}\right]_2^{\infty} = -2e^{-2} - e^{-2} + 1 + 2e^{-2} = 1 - e^{-2},$$
or, using tail probabilities,
$$E[Y] = \int_0^{\infty} (1 - F_Y(y))\,dy = \int_0^2 e^{-y}\,dy = 1 - e^{-2}.$$
(d) For z ≥ 1,
$$F_Z(z) = P(Z \le z) = P\left(e^{X/3} \le z\right) = P(X \le 3\log z) = 1 - e^{-3\log z} = 1 - \left(\frac{1}{z}\right)^3,$$
and F_Z(z) = 0 for z < 1. Also,
$$f_Z(z) = \begin{cases} 3\left(\frac{1}{z}\right)^4, & z \ge 1,\\ 0, & z < 1.\end{cases}$$
Thus Z =d Pareto(1, 3).

6. Let X =d exp(3) with pdf
$$f_X(x) = \begin{cases} 3e^{-3x}, & x \ge 0,\\ 0, & x < 0,\end{cases}$$
and Y = ψ(X) = e^X.
(a) Find the cdf of X and, using the formula for computing the moments through the tail probabilities, derive µ := E[X] and E[X²]. Evaluate V(X).
(b) Calculate E[Y] and V(Y).
(c) Compute the approximate values of E[Y] and V(Y) using
$$E[\psi(X)] \approx \psi(\mu) + \tfrac{1}{2}\psi''(\mu)V(X) \quad\text{and}\quad V(\psi(X)) \approx \psi'(\mu)^2\, V(X).$$
Do you expect good approximations? Justify your answer.
(d) The "rand" command in MATLAB can be used to generate a realisation (an observation) of a random variable U =d R(0, 1). Explain how to generate a realisation of the random variable X. Explain how to calculate an estimate of E(sin(X)) from 100 realisations of the random variable U =d R(0, 1).
[15 marks]

Solution
(a) The cdf is
$$F_X(x) = \begin{cases} 1 - e^{-3x}, & x \ge 0,\\ 0, & x < 0.\end{cases}$$
Hence
$$\mu = \int_0^{\infty}(1 - F_X(x))\,dx = \int_0^{\infty} e^{-3x}\,dx = \frac{1}{3},$$
$$E(X^2) = 2\int_0^{\infty} x(1 - F_X(x))\,dx = 2\int_0^{\infty} xe^{-3x}\,dx = \left[-\frac{2}{3}xe^{-3x}\right]_0^{\infty} + \frac{2}{3}\int_0^{\infty} e^{-3x}\,dx = \frac{2}{9},$$
$$V(X) = E(X^2) - [E(X)]^2 = \frac{1}{9}.$$
(b) Direct computation gives
$$E(Y) = \int_0^{\infty} e^x \cdot 3e^{-3x}\,dx = \frac{3}{2}\int_0^{\infty} 2e^{-2x}\,dx = 1.5,$$
$$E(Y^2) = \int_0^{\infty} e^{2x}\cdot 3e^{-3x}\,dx = 3\int_0^{\infty} e^{-x}\,dx = 3,$$
$$V(Y) = E(Y^2) - [E(Y)]^2 = 3 - 1.5^2 = 0.75.$$
(c) Since ψ(x) = eˣ and µ = 1/3, we have ψ′(x) = ψ″(x) = eˣ, so
$$E(Y) \approx e^{1/3} + \frac{1}{2}e^{1/3}\cdot\frac{1}{9} \approx 1.4731, \qquad V(Y) \approx e^{2/3}\cdot\frac{1}{9} \approx 0.2164.$$
Since the function ψ(x) = eˣ grows very fast as x → ∞ and the distribution of X has a relatively long right tail, the approximations are expected to be poor, especially for the variance.
(d) Setting F_X(x) = u gives x = F_X^{-1}(u) = −(1/3) ln(1 − u), so X can be simulated using
$$X \stackrel{d}{=} -\tfrac{1}{3}\ln(1-U) \quad\text{or}\quad X \stackrel{d}{=} -\tfrac{1}{3}\ln U,$$
where the second form uses the fact that 1 − U =d R(0, 1). This also gives, for 100 realisations Uᵢ, 1 ≤ i ≤ 100, of R(0, 1),
$$E(\sin(X)) \approx \frac{1}{100}\sum_{i=1}^{100} \sin\left(-\tfrac{1}{3}\ln U_i\right).$$

7. Consider the bivariate random variable (X, Y) which has joint probability density function
$$f_{(X,Y)}(x,y) = \begin{cases} \tfrac{1}{2}, & 0 < x, y < 1,\\ \tfrac{1}{2}, & -1 < x, y \le 0,\\ 0, & \text{elsewhere}.\end{cases}$$
(a) Derive the marginal probability density functions for X and Y.
(b) Evaluate the following probabilities: (i) P(X > 0, Y > 0); (ii) P(X > 1/2, Y < 1/2); (iii) P(X + Y ≤ 1); (iv) P(X² + Y² ≤ 1).
(c) Are X and Y independent? Justify your answer.
[9 marks]

Solution
(a) f_X(x) = 0 for x ≤ −1 or x ≥ 1. For −1 < x ≤ 0, f_X(x) = ∫₋₁⁰ (1/2) dy = 1/2. For 0 < x < 1, f_X(x) = ∫₀¹ (1/2) dy = 1/2. Hence X =d R(−1, 1), and by symmetry Y =d R(−1, 1).
(b)
(i) P(X > 0, Y > 0) = ∫₀¹ ∫₀¹ (1/2) dx dy = 1/2.
(ii) P(X > 1/2, Y < 1/2) = ∫_{1/2}^{1} ( ∫₀^{1/2} (1/2) dy ) dx = 1/8.
(iii) P(X + Y ≤ 1) = 1 − P(X + Y > 1) = 1 − ∫₀¹ ( ∫_{1−x}^{1} (1/2) dy ) dx = 1 − ∫₀¹ (x/2) dx = 3/4.
(iv) By symmetry,
$$P(X^2+Y^2 \le 1) = 2P(X^2+Y^2 \le 1,\ X > 0,\ Y > 0) = 2\int_0^1\left(\int_0^{\sqrt{1-x^2}} \tfrac{1}{2}\,dy\right)dx = \int_0^1\left(\int_0^{\sqrt{1-x^2}} dy\right)dx = \frac{\pi}{4},$$
because the last integral is the area of a quarter of the disk with radius 1.
(c) No, they are not independent, because
$$f_{X,Y}(-1/2,\, 1/2) = 0 \ne \tfrac{1}{2}\cdot\tfrac{1}{2} = f_X(-1/2)\, f_Y(1/2).$$

8. Let X be a discrete random variable having the pmf P(X = −1) = P(X = 1) = 0.5.
(a) Find the mean µ and variance σ² of X.
(b) Derive the moment generating function of X and state the values for which it is defined.
(c) Find the skewness and kurtosis of X. Comment on your findings.
(d) Find the cumulant generating function of X.
(e) Let X₁, X₂, … be a sequence of independent random variables where each one has the same distribution as X. Derive the moment generating function of
$$S_n = \frac{X_1 + X_2 + \cdots + X_n}{\sqrt{n}}.$$
(f) Using the moment generating function of Sₙ, prove that Sₙ converges to N(0, σ²) in distribution as n → ∞, where σ² is the same as that in (a). Hint: you may wish to use $e^x \approx 1 + x + \frac{x^2}{2} + \frac{x^3}{6}$ for small x.
(g) Let T = X₁ + X₂ + … + X₁₀₀. Give an approximate value of P(T < 12).
[18 marks]

Solution
(a) µ = (−1) · 0.5 + 1 · 0.5 = 0, and σ² = E(X²) − µ² = (−1)² · 0.5 + 1² · 0.5 − 0² = 1.
(b) The moment generating function is
$$M_X(t) = E(e^{tX}) = e^{-t}\cdot 0.5 + e^{t}\cdot 0.5 = 0.5(e^{-t} + e^{t}),$$
defined for all t ∈ ℝ.
(c) The skewness of X is
$$\kappa_3 = E[(X-\mu)^3] = E(X^3) = (-1)^3\cdot 0.5 + 1^3\cdot 0.5 = 0.$$
The kurtosis can be calculated as
$$\kappa_4 = E[(X-\mu)^4] - 3\sigma^4 = E(X^4) - 3 = (-1)^4\cdot 0.5 + 1^4\cdot 0.5 - 3 = -2.$$
κ₃ = 0 reflects that the distribution is symmetric around its mean, while κ₄ < 0 shows that the distribution is flatter and shorter-tailed than N(0, 1).
(d) The cumulant generating function is
$$K_X(t) = \ln M_X(t) = \ln\big(0.5(e^{-t} + e^{t})\big).$$
(e) Using independence, we have
$$M_{S_n}(t) = E\left(e^{tS_n}\right) = E\left(\prod_{i=1}^n e^{tX_i/\sqrt{n}}\right) = \prod_{i=1}^n E\left(e^{tX_i/\sqrt{n}}\right) = \left(0.5\left(e^{-t/\sqrt{n}} + e^{t/\sqrt{n}}\right)\right)^n,$$
where the last equality is due to the fact that the Xᵢ have the same distribution as X.
(f) Using the approximation e^x ≈ 1 + x + x²/2 + x³/6 for x close to 0, we have
$$0.5\left[e^{-t/\sqrt{n}} + e^{t/\sqrt{n}}\right] \approx \frac{1}{2}\left\{1 - \frac{t}{\sqrt n} + \frac{t^2}{2n} - \frac{t^3}{6n^{3/2}} + 1 + \frac{t}{\sqrt n} + \frac{t^2}{2n} + \frac{t^3}{6n^{3/2}}\right\} = 1 + \frac{t^2}{2n},$$
hence, for each t ∈ ℝ,
$$M_{S_n}(t) \approx \left(1 + \frac{t^2}{2n}\right)^n \to e^{t^2/2}, \quad\text{as } n \to \infty,$$
where the limit is the mgf of N(0, 1). This ensures that Sₙ converges in distribution to N(0, 1) = N(0, σ²).
(g) By the CLT in (f), since T has mean 0 and standard deviation √100 = 10,
$$P(T < 12) = P\left(\frac{T - 0}{10} < \frac{12 - 0}{10}\right) \approx P(Z < 1.2) = 0.8849,$$
where Z =d N(0, 1) and the final probability is from the normal distribution table.
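As a check on the approximation in (g) (not part of the original paper): each Xᵢ is ±1 with probability 1/2, so T = 2H − 100, where H =d Bi(100, 1/2) counts the +1's, and T < 12 exactly when H ≤ 55, so the exact probability is a binomial sum. A minimal Python sketch:

```python
from math import comb

# Exact: P(T < 12) = P(H <= 55), H ~ Bi(100, 1/2)
p_exact = sum(comb(100, h) for h in range(56)) / 2**100

p_clt = 0.8849   # the CLT approximation Phi(1.2) used in (g)
```

The exact value is about 0.864; Φ(1.2) = 0.8849 overshoots slightly because T takes only even values, and a continuity-corrected approximation P(Z < 1.1) = Φ(1.1) = 0.8643 (from the appendix table) lands much closer.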
9. Electricity usage during a summer day in Melbourne can be classified as 1 = normal, 2 = high, or 3 = low. Weather conditions often make this level of usage change according to a Markov chain with transition matrix
$$P = \begin{pmatrix} 3/4 & 1/6 & 1/12\\ 2/5 & 1/3 & 4/15\\ 1/2 & 2/5 & 1/10 \end{pmatrix},$$
where rows and columns are indexed by the states 1, 2, 3.
(a) Which underlying assumption has been made so that this situation can be modelled as a Markov chain?
(b) Suppose that on 1 January the electricity usage is normal. What is the probability that the electricity usage is
(i) normal on 3 January (in the same year)?
(ii) normal on 4 January (in the same year)?
(iii) normal on all days from 2 January to 4 January (in the same year)?
Justify your answers where necessary. Hint: you may wish to use the MATLAB output in the Appendix to reduce your calculations.
(c) Without doing any calculations, explain two methods to obtain the stationary distribution of the Markov chain, one of which uses the MATLAB output. Interpret your answer.
[9 marks]

Solution
(a) The underlying (Markov) assumption is that, given the usage level on the present day, the usage level on the next day is conditionally independent of the usage on all earlier days; it is determined only by the present day's usage and the weather conditions of the next day.
(b) (i) 3 January is 2 days after 1 January, so the probability of normal usage on 3 January is the (1,1) entry of P², namely p₁₁⁽²⁾ = 0.6708 from the MATLAB output.
(ii) 4 January is 3 days after 1 January, so the probability of normal usage on 4 January is the (1,1) entry of P³, namely p₁₁⁽³⁾ = 0.6463 from the MATLAB output.
(iii) The probability of normal usage on all 3 consecutive days is (p₁₁)³ = (3/4)³ ≈ 0.4219.
(c) Let π = (π₁, π₂, π₃). One method is to solve the system of equations
$$\pi P = \pi, \qquad \pi_1 + \pi_2 + \pi_3 = 1,$$
to get the stationary distribution. Alternatively, since P⁵⁰ in the MATLAB output has all rows identical, we can regard P⁵⁰ ≈ lim_{n→∞} Pⁿ, that is, the Markov chain has reached stationarity by n = 50, and read the answer from any row of P⁵⁰:
$$(\pi_1, \pi_2, \pi_3) = (0.6352,\ 0.2361,\ 0.1288).$$
The stationary distribution says that, in the long run, the electricity usage is normal on 63.52% of summer days, high on 23.61% of summer days, and low on 12.88% of summer days.

10. Toss a biased coin with P(Head) = 1/4 and P(Tail) = 3/4 repeatedly. Let X be the number of tosses until two consecutive heads, and let Y be the number of heads until the first tail.
(a) Find the distribution of Y.
(b) For i = 0, 1, explain why P(X = n|Y = i) = P(X = n − i − 1) and show that E(X|Y = i) = E(X) + i + 1.
(c) For i ≥ 2, explain why P(X = 2|Y = i) = 1 and find E(X|Y = i).
(d) Using E(X) = E[E(X|Y)], derive E(X).
[8 marks]

Solution
(a) With "success" = tail and "failure" = head, Y counts the failures before the first success, so by the definition of the geometric distribution, Y =d G(3/4); that is, P(Y = i) = (1/4)ⁱ (3/4) for i = 0, 1, 2, ….
(b) {Y = i} = {H⋯H T}, with i heads followed by a tail. For i ≤ 1 there are no two consecutive heads among the first i + 1 tosses, so those i + 1 tosses are wasted and the count of tosses until two consecutive heads starts afresh at the (i + 2)th toss, giving P(X = n|Y = i) = P(X = n − i − 1). Now,
$$E(X\mid Y=i) = \sum_{n=i+3}^{\infty} n\,P(X = n\mid Y=i) = \sum_{n=i+3}^{\infty} n\,P(X = n-i-1) = \sum_{n=i+3}^{\infty} (n-i-1)P(X=n-i-1) + \sum_{n=i+3}^{\infty} (i+1)P(X=n-i-1) = E(X) + i + 1.$$
(c) Again {Y = i} = {H⋯H T}, so for i ≥ 2 the first two tosses resulted in heads and X = 2 on this event. Since the conditional distribution of X given Y = i has pmf 1 at 2, E(X|Y = i) = 2.
(d) Using the formula E(X) = E[E(X|Y)], we have
$$E(X) = \sum_{i=0}^{\infty} E(X\mid Y=i)\,P(Y=i) = (E(X)+1)\cdot\frac{3}{4} + (E(X)+2)\cdot\frac{1}{4}\cdot\frac{3}{4} + \sum_{i=2}^{\infty} 2\left(\frac{1}{4}\right)^i\frac{3}{4} = \frac{15}{16}E(X) + \frac{5}{4}.$$
Solving the equation gives E(X) = 20.
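The answer E(X) = 20 (which also matches the standard mean-waiting-time formula 1/p + 1/p² = 4 + 16 for the pattern HH with p = 1/4) can be checked by simulation; a minimal Python sketch (not part of the original paper):

```python
import random

random.seed(1)

def tosses_until_hh(p_head=0.25):
    """Number of tosses of a P(Head)=1/4 coin until two consecutive heads."""
    n = run = 0
    while run < 2:
        n += 1
        run = run + 1 if random.random() < p_head else 0
    return n

trials = 200_000
est = sum(tosses_until_hh() for _ in range(trials)) / trials  # should be near 20
```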
Total marks = 110 End of Questions Page 13 of 16 pages MAST20004 Semester 1, 2019 Appendix: Some MATLAB output >> x1=0.01:.01:1.00; x2=1.01:.01:2.00; x3=2.01:.01:3.00; y1=cdf('norm',x1,0,1); y2=cdf('norm',x2,0,1); y3=cdf('norm',x3,0,1); [x1' y1' x2' y2' x3' y3'] ans = 0.0100 0.5040 1.0100 0.8438 2.0100 0.9778 0.0200 0.5080 1.0200 0.8461 2.0200 0.9783 0.0300 0.5120 1.0300 0.8485 2.0300 0.9788 0.0400 0.5160 1.0400 0.8508 2.0400 0.9793 0.0500 0.5199 1.0500 0.8531 2.0500 0.9798 0.0600 0.5239 1.0600 0.8554 2.0600 0.9803 0.0700 0.5279 1.0700 0.8577 2.0700 0.9808 0.0800 0.5319 1.0800 0.8599 2.0800 0.9812 0.0900 0.5359 1.0900 0.8621 2.0900 0.9817 0.1000 0.5398 1.1000 0.8643 2.1000 0.9821 0.1100 0.5438 1.1100 0.8665 2.1100 0.9826 0.1200 0.5478 1.1200 0.8686 2.1200 0.9830 0.1300 0.5517 1.1300 0.8708 2.1300 0.9834 0.1400 0.5557 1.1400 0.8729 2.1400 0.9838 0.1500 0.5596 1.1500 0.8749 2.1500 0.9842 0.1600 0.5636 1.1600 0.8770 2.1600 0.9846 0.1700 0.5675 1.1700 0.8790 2.1700 0.9850 0.1800 0.5714 1.1800 0.8810 2.1800 0.9854 0.1900 0.5753 1.1900 0.8830 2.1900 0.9857 0.2000 0.5793 1.2000 0.8849 2.2000 0.9861 0.2100 0.5832 1.2100 0.8869 2.2100 0.9864 0.2200 0.5871 1.2200 0.8888 2.2200 0.9868 0.2300 0.5910 1.2300 0.8907 2.2300 0.9871 0.2400 0.5948 1.2400 0.8925 2.2400 0.9875 0.2500 0.5987 1.2500 0.8944 2.2500 0.9878 0.2600 0.6026 1.2600 0.8962 2.2600 0.9881 0.2700 0.6064 1.2700 0.8980 2.2700 0.9884 0.2800 0.6103 1.2800 0.8997 2.2800 0.9887 0.2900 0.6141 1.2900 0.9015 2.2900 0.9890 0.3000 0.6179 1.3000 0.9032 2.3000 0.9893 0.3100 0.6217 1.3100 0.9049 2.3100 0.9896 0.3200 0.6255 1.3200 0.9066 2.3200 0.9898 0.3300 0.6293 1.3300 0.9082 2.3300 0.9901 0.3400 0.6331 1.3400 0.9099 2.3400 0.9904 0.3500 0.6368 1.3500 0.9115 2.3500 0.9906 0.3600 0.6406 1.3600 0.9131 2.3600 0.9909 0.3700 0.6443 1.3700 0.9147 2.3700 0.9911 0.3800 0.6480 1.3800 0.9162 2.3800 0.9913 0.3900 0.6517 1.3900 0.9177 2.3900 0.9916 0.4000 0.6554 1.4000 0.9192 2.4000 0.9918 0.4100 0.6591 1.4100 0.9207 2.4100 0.9920 0.4200 
0.6628 1.4200 0.9222 2.4200 0.9922 0.4300 0.6664 1.4300 0.9236 2.4300 0.9925 0.4400 0.6700 1.4400 0.9251 2.4400 0.9927 0.4500 0.6736 1.4500 0.9265 2.4500 0.9929 0.4600 0.6772 1.4600 0.9279 2.4600 0.9931 0.4700 0.6808 1.4700 0.9292 2.4700 0.9932 0.4800 0.6844 1.4800 0.9306 2.4800 0.9934 0.4900 0.6879 1.4900 0.9319 2.4900 0.9936 Page 14 of 16 pages MAST20004 Semester 1, 2019 0.5000 0.6915 1.5000 0.9332 2.5000 0.9938 0.5100 0.6950 1.5100 0.9345 2.5100 0.9940 0.5200 0.6985 1.5200 0.9357 2.5200 0.9941 0.5300 0.7019 1.5300 0.9370 2.5300 0.9943 0.5400 0.7054 1.5400 0.9382 2.5400 0.9945 0.5500 0.7088 1.5500 0.9394 2.5500 0.9946 0.5600 0.7123 1.5600 0.9406 2.5600 0.9948 0.5700 0.7157 1.5700 0.9418 2.5700 0.9949 0.5800 0.7190 1.5800 0.9429 2.5800 0.9951 0.5900 0.7224 1.5900 0.9441 2.5900 0.9952 0.6000 0.7257 1.6000 0.9452 2.6000 0.9953 0.6100 0.7291 1.6100 0.9463 2.6100 0.9955 0.6200 0.7324 1.6200 0.9474 2.6200 0.9956 0.6300 0.7357 1.6300 0.9484 2.6300 0.9957 0.6400 0.7389 1.6400 0.9495 2.6400 0.9959 0.6500 0.7422 1.6500 0.9505 2.6500 0.9960 0.6600 0.7454 1.6600 0.9515 2.6600 0.9961 0.6700 0.7486 1.6700 0.9525 2.6700 0.9962 0.6800 0.7517 1.6800 0.9535 2.6800 0.9963 0.6900 0.7549 1.6900 0.9545 2.6900 0.9964 0.7000 0.7580 1.7000 0.9554 2.7000 0.9965 0.7100 0.7611 1.7100 0.9564 2.7100 0.9966 0.7200 0.7642 1.7200 0.9573 2.7200 0.9967 0.7300 0.7673 1.7300 0.9582 2.7300 0.9968 0.7400 0.7704 1.7400 0.9591 2.7400 0.9969 0.7500 0.7734 1.7500 0.9599 2.7500 0.9970 0.7600 0.7764 1.7600 0.9608 2.7600 0.9971 0.7700 0.7794 1.7700 0.9616 2.7700 0.9972 0.7800 0.7823 1.7800 0.9625 2.7800 0.9973 0.7900 0.7852 1.7900 0.9633 2.7900 0.9974 0.8000 0.7881 1.8000 0.9641 2.8000 0.9974 0.8100 0.7910 1.8100 0.9649 2.8100 0.9975 0.8200 0.7939 1.8200 0.9656 2.8200 0.9976 0.8300 0.7967 1.8300 0.9664 2.8300 0.9977 0.8400 0.7995 1.8400 0.9671 2.8400 0.9977 0.8500 0.8023 1.8500 0.9678 2.8500 0.9978 0.8600 0.8051 1.8600 0.9686 2.8600 0.9979 0.8700 0.8078 1.8700 0.9693 2.8700 0.9979 0.8800 0.8106 1.8800 0.9699 
2.8800 0.9980 0.8900 0.8133 1.8900 0.9706 2.8900 0.9981 0.9000 0.8159 1.9000 0.9713 2.9000 0.9981 0.9100 0.8186 1.9100 0.9719 2.9100 0.9982 0.9200 0.8212 1.9200 0.9726 2.9200 0.9982 0.9300 0.8238 1.9300 0.9732 2.9300 0.9983 0.9400 0.8264 1.9400 0.9738 2.9400 0.9984 0.9500 0.8289 1.9500 0.9744 2.9500 0.9984 0.9600 0.8315 1.9600 0.9750 2.9600 0.9985 0.9700 0.8340 1.9700 0.9756 2.9700 0.9985 0.9800 0.8365 1.9800 0.9761 2.9800 0.9986 0.9900 0.8389 1.9900 0.9767 2.9900 0.9986 1.0000 0.8413 2.0000 0.9772 3.0000 0.9987 Page 15 of 16 pages MAST20004 Semester 1, 2019 >> y=[3/4 1/6 1/12; 2/5 1/3 4/15; 1/2 2/5 1/10]; y2=y^2; y3=y^3; y4=y^4; y5=y^5; y20=y^(20); y50=y^(50) ans y = 0.7500 0.1667 0.0833 0.4000 0.3333 0.2667 0.5000 0.4000 0.1000 y2 = 0.6708 0.2139 0.1153 0.5667 0.2844 0.1489 0.5850 0.2567 0.1583 y3 = 0.6463 0.2292 0.1245 0.6132 0.2488 0.1380 0.6206 0.2464 0.1330 y4 = 0.6387 0.2339 0.1274 0.6284 0.2403 0.1312 0.6305 0.2388 0.1307 y5 = 0.6363 0.2354 0.1283 0.6331 0.2373 0.1296 0.6337 0.2370 0.1293 y20 = 0.6352 0.2361 0.1288 0.6352 0.2361 0.1288 0.6352 0.2361 0.1288 y50 = 0.6352 0.2361 0.1288 0.6352 0.2361 0.1288 0.6352 0.2361 0.1288 Page 16 of 16 pages
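The two methods in Question 9(c) agree; the matrix-power computation from the MATLAB output above can be reproduced outside MATLAB with a short sketch (Python here for convenience; not part of the original paper):

```python
# Transition matrix from Question 9 (rows/columns: 1=normal, 2=high, 3=low)
P = [[3/4, 1/6, 1/12],
     [2/5, 1/3, 4/15],
     [1/2, 2/5, 1/10]]

def matmul(A, B):
    """Plain 3x3 matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

Pn = P
for _ in range(49):   # Pn = P^50
    Pn = matmul(Pn, P)

pi = Pn[0]            # any row of P^50 approximates the stationary distribution
```

The rows of P^50 are identical to four decimal places, matching (0.6352, 0.2361, 0.1288) from the MATLAB output.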