3. Let {x1, x2, ..., xn} be a list of numbers and let x̄ denote the average of the list. Let a and b be two constants, and for each i such that 1 ≤ i ≤ n, let yi = a·xi + b. Consider the new list {y1, y2, ..., yn}, and let the average of this list be ȳ. Prove a formula for ȳ in terms of a, b, and x̄.
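A quick numerical sanity check of the formula being asked for (the list, a, and b below are arbitrary illustrative choices; this is not a proof):

```python
# Numerical check for Problem 3 (not a proof): if y_i = a*x_i + b,
# the average of the y's should equal a*xbar + b.
# The list xs and the constants a, b are arbitrary choices for illustration.

def average(nums):
    """Plain arithmetic mean."""
    return sum(nums) / len(nums)

xs = [1.5, -2.0, 4.0, 7.25, 0.0]
a, b = 3.0, -1.0
ys = [a * x + b for x in xs]

xbar = average(xs)
ybar = average(ys)

# ybar should agree with a*xbar + b up to floating-point rounding.
assert abs(ybar - (a * xbar + b)) < 1e-12
```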
4. Let n be a positive integer. Consider the list of even numbers {2,4,6,...,2n}. What is the
average of this list? Prove your answer.
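A small empirical check of the answer (the average works out to n + 1, since 2 + 4 + ... + 2n = n(n + 1)); illustration only, not the requested proof:

```python
# Empirical check for Problem 4 (not a proof): the average of
# {2, 4, ..., 2n} is n + 1, since the sum is 2(1 + 2 + ... + n) = n(n + 1).

def average_of_evens(n):
    """Average of the list 2, 4, ..., 2n computed directly."""
    evens = list(range(2, 2 * n + 1, 2))
    return sum(evens) / len(evens)

for n in (1, 2, 10, 137):
    assert average_of_evens(n) == n + 1
```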
5. Let {x_n} and {y_n} be sequences of real numbers such that x_1 = 2, y_1 = 8, and for n = 1, 2, 3, ...

    x_{n+1} = (x_n^2 y_n + x_n y_n^2) / (x_n^2 + y_n^2)   and   y_{n+1} = (x_n^2 + y_n^2) / (x_n + y_n).

(a) Prove that x_{n+1} − y_{n+1} = −(x_n^3 − y_n^3)(x_n − y_n) / ((x_n + y_n)(x_n^2 + y_n^2)) for all positive integers n.
(b) Show that 0 < x_n ≤ y_n for all positive integers n. Hence, prove...
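A numerical check of the identity in (a) and the ordering in (b), using the recurrences as reconstructed above; it also observes that the product x_n·y_n stays at x_1·y_1 = 16 along the iteration. Illustration only, not a proof:

```python
# Numerical check for Problem 5, with x_{n+1} = (x^2 y + x y^2)/(x^2 + y^2)
# and y_{n+1} = (x^2 + y^2)/(x + y). Verifies identity (a), the ordering
# 0 < x_n <= y_n from (b), and the invariant x_n * y_n = 16 for a few terms.

def step(x, y):
    x_next = (x**2 * y + x * y**2) / (x**2 + y**2)
    y_next = (x**2 + y**2) / (x + y)
    return x_next, y_next

x, y = 2.0, 8.0
for _ in range(10):
    x_next, y_next = step(x, y)
    # Identity (a): x_{n+1} - y_{n+1} = -(x^3 - y^3)(x - y)/((x + y)(x^2 + y^2))
    rhs = -(x**3 - y**3) * (x - y) / ((x + y) * (x**2 + y**2))
    assert abs((x_next - y_next) - rhs) < 1e-9
    # Ordering (b): 0 < x_n <= y_n
    assert 0 < x_next <= y_next
    # The product is preserved by each step (it starts at 2 * 8 = 16).
    assert abs(x_next * y_next - 16.0) < 1e-9
    x, y = x_next, y_next
```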
4. Let X1, X2, ... be independent random variables satisfying E(X_n^4) ≤ B for some finite B > 0 and E(X_n) → μ.
(a) Show that the Y_n = X_n − E(X_n) are independent and that E(Y_n) = 0, E(Y_n^2) ≤ √B, and E(Y_n^4) ≤ 16B.
(b) Show that for Ȳ_n = (Y_1 + ... + Y_n)/n,

    E(Ȳ_n^4) = (1/n^4) Σ_{i=1}^n E(Y_i^4) + (6/n^4) Σ_{i<j} E(Y_i^2) E(Y_j^2) ≤ 16B/n^3 + 3B/n^2.

(c) Show that Σ_n P(|Ȳ_n| > ε) < ∞ and conclude Ȳ_n → 0 almost surely.
(d) Show that (X_1 + ...
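A small Monte Carlo illustration of the conclusion Ȳ_n → 0 (the distribution of the X_i below is an arbitrary choice with bounded fourth moments; this is a sanity check, not part of the requested proof):

```python
# Illustration for the fourth-moment SLLN problem: the centered average
# Ybar_n = (Y_1 + ... + Y_n)/n should be near 0 for large n.
# Here X_i ~ Uniform(0, i mod 3 + 1), an arbitrary assumed choice whose
# fourth moments are bounded.
import random

random.seed(0)

def ybar(n):
    total = 0.0
    for i in range(1, n + 1):
        hi = i % 3 + 1            # X_i ~ Uniform(0, hi), so E(X_i) = hi / 2
        total += random.uniform(0, hi) - hi / 2.0
    return total / n

assert abs(ybar(100_000)) < 0.05  # the centered average is close to 0
```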
Consider a random sample (X1, Y1), (X2, Y2), ..., (Xn, Yn) where Y | X = x is modeled by a N(β0 + β1·x, σ^2) distribution, where β0, β1, and σ^2 are unknown.
(a) Prove that the MLE of β1 is an unbiased estimator of β1.
(b) Prove that the MLE of β0 is an unbiased estimator of β0.
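A Monte Carlo sketch of (a): under this normal model the MLE of β1 is the least-squares slope, so its average over many simulated samples should sit near the true β1. The values of β0, β1, σ, and the x design below are arbitrary assumed choices; illustration only, not a proof:

```python
# Monte Carlo sketch (not a proof): the MLE of beta_1 is the least-squares
# slope sum((x_i - xbar)(y_i - ybar)) / sum((x_i - xbar)^2); its empirical
# mean over many samples should be close to the true beta_1.
# beta0, beta1, sigma, and the design points are assumed values.
import random

random.seed(1)
beta0, beta1, sigma = 2.0, 0.5, 1.0
xs = [float(i) for i in range(1, 11)]        # fixed design points
xbar = sum(xs) / len(xs)
sxx = sum((x - xbar) ** 2 for x in xs)

def slope_mle():
    ys = [beta0 + beta1 * x + random.gauss(0, sigma) for x in xs]
    ybar = sum(ys) / len(ys)
    return sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / sxx

estimates = [slope_mle() for _ in range(20_000)]
mean_est = sum(estimates) / len(estimates)
assert abs(mean_est - beta1) < 0.01          # empirical bias is tiny
```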
Let X1, X2, ..., Xn be independent Exp(2)-distributed random variables, and set Y1 = X(1) and Yk = X(k) − X(k−1) for 2 ≤ k ≤ n. Find the joint pdf of Y1, Y2, ..., Yn.
Hint: Note that (Y1, Y2, ..., Yn) = g(X(1), X(2), ..., X(n)), where g is invertible and differentiable. Use the change-of-variable formula to derive the joint pdf of Y1, Y2, ..., Yn.
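A simulation sketch of where the change of variables leads, reading Exp(2) as rate 2 (mean 1/2): the computation gives Yk distributed exponentially with rate 2(n − k + 1), independently, so E(Yk) = 1/(2(n − k + 1)). The empirical means below should land near those values; illustration only:

```python
# Simulation sketch for the spacings problem (Exp(2) read as rate 2).
# The change-of-variable answer gives Y_k ~ Exp(rate 2(n - k + 1)),
# independently, so E(Y_k) = 1/(2(n - k + 1)); compare empirical means.
import random

random.seed(2)
n, reps = 4, 50_000
sums = [0.0] * n
for _ in range(reps):
    xs = sorted(random.expovariate(2.0) for _ in range(n))
    prev = 0.0
    for k, x in enumerate(xs):
        sums[k] += x - prev       # k-th spacing X_(k) - X_(k-1)
        prev = x

means = [s / reps for s in sums]
for k, m in enumerate(means, start=1):
    assert abs(m - 1.0 / (2 * (n - k + 1))) < 0.02
```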
Let X1, X2, ..., Xn be iid with density f(x) = θ·x^(θ−1) for x ∈ (0, 1), where θ is a positive number. Is the parameter identifiable? Compute the maximum likelihood estimate. If instead of X1, X2, ... we observe Y1, Y2, ..., Yn, where Yi = 1(Xi ≤ 0.5): what distribution does Yi follow? What is the parameter of this distribution? Compute the MLE, the method-of-moments estimator, and the Fisher information.
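A numerical sketch of two of the answers: the MLE works out to θ̂ = −n / Σ log Xi, and Yi = 1(Xi ≤ 0.5) is Bernoulli with success probability P(X ≤ 0.5) = 0.5^θ. The true θ = 3 below is an arbitrary assumed value; illustration only:

```python
# Sketch for f(x) = theta * x^(theta - 1) on (0, 1): checks numerically that
# theta_hat = -n / sum(log X_i) recovers theta, and that
# P(X <= 0.5) = 0.5^theta governs Y_i = 1(X_i <= 0.5).
import math
import random

random.seed(3)
theta, n = 3.0, 200_000
# Inverse-CDF sampling: F(x) = x^theta, so X = U^(1/theta) for U ~ Uniform(0,1).
xs = [random.random() ** (1.0 / theta) for _ in range(n)]

theta_hat = -n / sum(math.log(x) for x in xs)
assert abs(theta_hat - theta) < 0.05         # MLE lands close to the truth

p_hat = sum(x <= 0.5 for x in xs) / n
assert abs(p_hat - 0.5 ** theta) < 0.01      # matches P(X <= 0.5) = 0.5^theta
```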
(a) Show that the points (x1, y1), (x2, y2), ..., (xn, yn) are collinear in R^2 if and only if

    rank [ 1  x1  y1 ]
         [ 1  x2  y2 ]
         [ ...       ]
         [ 1  xn  yn ]  ≤ 2.

(b) What is the generalization of part (a) to points (x1, y1, z1), (x2, y2, z2), ..., (xn, yn, zn) in R^3? Explain.
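A numerical illustration of the rank criterion in (a), using a small pure-Python rank computation (tolerance-based Gaussian elimination, written here only for illustration):

```python
# Illustration of (a): build the n-by-3 matrix with rows (1, x_i, y_i)
# and test collinearity via rank <= 2.

def matrix_rank(rows, tol=1e-9):
    """Rank of a small dense matrix by Gaussian elimination with pivoting."""
    m = [list(map(float, r)) for r in rows]
    rank, ncols = 0, len(m[0])
    for col in range(ncols):
        if rank == len(m):
            break
        pivot = max(range(rank, len(m)), key=lambda r: abs(m[r][col]))
        if abs(m[pivot][col]) < tol:
            continue
        m[rank], m[pivot] = m[pivot], m[rank]
        for r in range(rank + 1, len(m)):
            f = m[r][col] / m[rank][col]
            for c in range(col, ncols):
                m[r][c] -= f * m[rank][c]
        rank += 1
    return rank

def collinear(points):
    """True iff the 2-D points are collinear: rank of the (1, x, y) rows <= 2."""
    return matrix_rank([[1.0, x, y] for x, y in points]) <= 2

assert collinear([(0, 1), (1, 3), (2, 5), (3, 7)])   # all on y = 2x + 1
assert not collinear([(0, 0), (1, 0), (0, 1)])
```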
In 10.11, let X1, X2, ..., Xn and Y1, Y2, ..., Ym be independent samples from N(μ, σ^2) and N(μ, τ^2), respectively, where μ, σ^2, τ^2 are unknown. Let ρ = τ^2/σ^2 and g = m/n, and consider the problem of unbiased estimation of μ.
(d) Show that if λ and μ are distinct eigenvalues of a symmetric square matrix A, x = (x1, x2, ..., xn) ∈ E_λ[A], and y = (y1, y2, ..., yn) ∈ E_μ[A], then

    x · y = x1·y1 + x2·y2 + ... + xn·yn = 0.
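A concrete check of (d) on an assumed 2×2 symmetric example: A = [[2, 1], [1, 2]] has eigenvalues 3 and 1 with eigenvectors (1, 1) and (1, −1), and those eigenvectors are orthogonal. Illustration only, not the requested proof:

```python
# Concrete instance of (d): for the symmetric matrix A = [[2, 1], [1, 2]],
# (1, 1) and (1, -1) are eigenvectors for the distinct eigenvalues 3 and 1,
# and their dot product is 0.

def mat_vec(a, v):
    return [sum(a[i][j] * v[j] for j in range(len(v))) for i in range(len(a))]

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

A = [[2.0, 1.0], [1.0, 2.0]]
x, lam = [1.0, 1.0], 3.0         # A x = 3 x
y, mu = [1.0, -1.0], 1.0         # A y = 1 y

assert mat_vec(A, x) == [lam * xi for xi in x]
assert mat_vec(A, y) == [mu * yi for yi in y]
assert dot(x, y) == 0.0          # distinct eigenvalues -> orthogonal
```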