TOPIC: Use Chebyshev's inequality to find the required estimate of the given probability.
Let X1, ..., X10 be independent and identically distributed, each with expectation 0 and variance 0.4. Write S = ...
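The statement above is truncated, but a typical version takes S = X1 + ... + X10, so Var(S) = 10 × 0.4 = 4 and Chebyshev's inequality gives P(|S| ≥ t) ≤ 4/t². A simulation sketch under that assumption (the Xi's distribution is not specified, so normals are used purely for illustration):

```python
import random

# Hypothetical setup (the problem statement is truncated): take
# S = X1 + ... + X10 with E[Xi] = 0 and Var(Xi) = 0.4, so Var(S) = 4.
# Chebyshev's inequality then gives P(|S| >= t) <= Var(S) / t**2.
# The Xi distribution is not specified; normals are used only to
# illustrate that the bound holds.
random.seed(0)
n_trials = 100_000
t = 3.0
var_s = 10 * 0.4                  # Var(S) = 4
bound = var_s / t**2              # Chebyshev bound = 4/9 ≈ 0.444

count = 0
for _ in range(n_trials):
    s = sum(random.gauss(0, 0.4**0.5) for _ in range(10))
    if abs(s) >= t:
        count += 1

empirical = count / n_trials
print(f"empirical P(|S| >= {t}) = {empirical:.4f}, Chebyshev bound = {bound:.4f}")
```

As expected, the empirical frequency sits well below the Chebyshev bound, since Chebyshev makes no distributional assumptions.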
3. Suppose X1, X2, ... are independent identically distributed random variables with mean 0 and variance 1. Let Sn = X1 + ... + Xn denote the partial sum, and let Fn denote the information contained in X1, ..., Xn. Suppose m < n. (1) Compute E[(Sn − Sm) | Fm]. (2) Compute E[Sm(Sn − Sm) | Fm]. (3) Compute E[Sn^2 | Fm]. (Hint: write Sn^2 = Sm^2 + 2Sm(Sn − Sm) + (Sn − Sm)^2.) (4) Verify that Sn^2 − n is a martingale.
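The martingale property in part (4) amounts to E[Sn^2 − n | Fm] = Sm^2 − m. A numerical sketch (not a proof): fix one realized prefix X1..Xm, then average Sn^2 − n over many independent continuations. Rademacher (±1) steps are used as a convenient mean-0, variance-1 distribution.

```python
import random

# Numerical check that M_n = S_n**2 - n behaves as a martingale for
# i.i.d. steps with mean 0 and variance 1: fix one realized prefix
# X1..Xm, average S_n**2 - n over many continuations, and compare
# with S_m**2 - m.
random.seed(1)
m, n = 5, 12
prefix = [random.choice([-1, 1]) for _ in range(m)]   # ±1 steps: mean 0, var 1
s_m = sum(prefix)

n_trials = 200_000
total = 0.0
for _ in range(n_trials):
    s_n = s_m + sum(random.choice([-1, 1]) for _ in range(n - m))
    total += s_n**2 - n

estimate = total / n_trials
print(f"E[S_n^2 - n | F_m] ≈ {estimate:.3f},  S_m^2 - m = {s_m**2 - m}")
```

The two printed values agree up to Monte Carlo error, matching the algebra in the hint: E[Sn^2 | Fm] = Sm^2 + (n − m).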
3. Suppose X1, X2, ... are independent identically distributed random variables with mean μ and variance σ^2. Let S0 = 0 and for n > 0 let Sn = X1 + ... + Xn denote the partial sum. Let Fn denote the information contained in X1, ..., Xn. (1) Verify that Sn − nμ is a martingale. (2) Assuming μ = 0, verify that Sn^2 − nσ^2 is a martingale.
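Both parts reduce to a one-step conditional expectation; a sketch of the computation (in LaTeX):

```latex
% Part (1): using E[X_{n+1} \mid \mathcal{F}_n] = \mu (independence):
\begin{align*}
E\!\left[S_{n+1} - (n+1)\mu \mid \mathcal{F}_n\right]
  &= S_n + \mu - (n+1)\mu = S_n - n\mu.
\end{align*}
% Part (2): with \mu = 0, so E[X_{n+1}] = 0 and E[X_{n+1}^2] = \sigma^2:
\begin{align*}
E\!\left[S_{n+1}^2 - (n+1)\sigma^2 \mid \mathcal{F}_n\right]
  &= S_n^2 + 2S_n\,E[X_{n+1}] + E[X_{n+1}^2] - (n+1)\sigma^2
   = S_n^2 - n\sigma^2.
\end{align*}
```

In each case the conditional expectation returns the process at time n, which is the martingale property.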
3. Suppose X1,X2, are independent identically distributed random variables with mean μ and variance σ2. Let So = 0 and for n...
2. (15pts) Let X1, X2 be independent and identically distributed with Uniform(0, θ) density. (a) Is Y = X1 + X2 a sufficient statistic for θ? Hint: you need to find the conditional density of (X1, X2) given Y = X1 + X2. (b) Consider now S := max(X1, X2). Is S a sufficient statistic for θ?
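A simulation sketch supporting part (b): if S = max(X1, X2) is sufficient for θ, the conditional distribution of the sample given S cannot depend on θ. Summarizing the ratio X1/S under two different θ values should therefore give matching results up to Monte Carlo error.

```python
import random

# If S = max(X1, X2) is sufficient for theta, summaries of X1/S must not
# depend on theta.  We compare P(X1 is the max) and the mean of X1/S on
# the event {X1 < S} for two theta values.
random.seed(2)

def ratio_summary(theta, n_trials=200_000):
    """Return (P(X1 is the max), E[X1/S given X1 < S]) for Uniform(0, theta)."""
    hits, ratio_total, ratio_count = 0, 0.0, 0
    for _ in range(n_trials):
        x1, x2 = random.uniform(0, theta), random.uniform(0, theta)
        s = max(x1, x2)
        if x1 == s:
            hits += 1
        else:
            ratio_total += x1 / s
            ratio_count += 1
    return hits / n_trials, ratio_total / ratio_count

results = {theta: ratio_summary(theta) for theta in (1.0, 5.0)}
for theta, (p_top, mean_ratio) in results.items():
    print(f"theta={theta}: P(X1=max) ≈ {p_top:.3f}, E[X1/S | X1<S] ≈ {mean_ratio:.3f}")
```

Both summaries come out the same for θ = 1 and θ = 5 (≈ 0.5 each), consistent with sufficiency of the maximum; this illustrates, but does not replace, the factorization argument.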
13. Let X1, X2, ..., XN be a sequence of independent and identically distributed discrete random variables, each with probability mass function P(X = k) = e^(−a) a^k / k! for k = 0, 1, 2, 3, .... (a) Find the expected value and the variance of the sample mean X̄ = (1/N) Σ_{i=1}^N Xi. (b) Find the probability mass function of X̄. (c) Find an approximate pdf of X̄ when N is very large (N → ∞).
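Reading the garbled pmf as the Poisson P(X = k) = e^(−a) a^k / k! (an assumption reconstructed from the fragments), part (a) gives E[X̄] = a and Var(X̄) = a/N, and part (c) is the CLT approximation X̄ ≈ Normal(a, a/N). A simulation sketch with illustrative values a = 2, N = 50:

```python
import math
import random

# Sketch for parts (a) and (c), taking the pmf to be Poisson(a):
# P(X = k) = exp(-a) * a**k / k!.  Then E[Xbar] = a, Var(Xbar) = a/N,
# and Xbar is approximately Normal(a, a/N) for large N.
random.seed(3)
a, N = 2.0, 50

def poisson_sample(lam):
    """Knuth's method: count uniforms until their product drops below exp(-lam)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

n_trials = 20_000
means = [sum(poisson_sample(a) for _ in range(N)) / N for _ in range(n_trials)]
emp_mean = sum(means) / n_trials
emp_var = sum((m - emp_mean) ** 2 for m in means) / (n_trials - 1)
print(f"E[Xbar] ≈ {emp_mean:.4f} (theory {a}),  Var(Xbar) ≈ {emp_var:.4f} (theory {a/N})")
```

For part (b), note that N·X̄ = Σ Xi is itself Poisson(Na), so X̄ puts mass e^(−Na)(Na)^k/k! on the points k/N.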
Let X1, ..., Xn be an independent identically distributed sample from the DISCRETE uniform distribution with probability mass function P(X = x) = 1/θ for x ∈ {1, 2, ..., θ} and 0 otherwise, i.e. θ is an unknown parameter and the probability that X takes any integer between 1 and θ is constant. Suppose we use θ̂ = 2X̄ − 1 to estimate θ. Find the bias and mean-squared error (MSE) of θ̂. (Note: Σ_{i=1}^θ i^2 = θ(θ + 1)(2θ + 1)/6.)
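Since E[X̄] = (θ + 1)/2, the estimator θ̂ = 2X̄ − 1 has E[θ̂] = θ, so the bias is 0 and MSE = Var(θ̂) = 4·Var(X)/n = (θ² − 1)/(3n). A simulation sketch with illustrative values θ = 10, n = 20 (not from the problem):

```python
import random

# The estimator theta_hat = 2*Xbar - 1 for the discrete uniform on
# {1, ..., theta}.  E[Xbar] = (theta + 1)/2, so theta_hat is unbiased and
# MSE = Var(theta_hat) = (theta**2 - 1)/(3*n).  theta = 10, n = 20 are
# illustrative choices only.
random.seed(4)
theta, n, n_trials = 10, 20, 100_000

estimates = []
for _ in range(n_trials):
    xbar = sum(random.randint(1, theta) for _ in range(n)) / n
    estimates.append(2 * xbar - 1)

bias = sum(estimates) / n_trials - theta
mse = sum((e - theta) ** 2 for e in estimates) / n_trials
print(f"bias ≈ {bias:.4f} (theory 0),  MSE ≈ {mse:.4f} (theory {(theta**2 - 1)/(3*n):.4f})")
```

The simulated bias is indistinguishable from 0 and the simulated MSE matches (θ² − 1)/(3n) = 1.65.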
3. Let {X1, X2, X3, X4} be independent, identically distributed random variables with p.d.f. f(x) = 2x if 0 < x < 1, and 0 otherwise. Find E[Y] where Y = min{X1, X2, X3, X4}.
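Here P(X > y) = 1 − y², so P(Y > y) = (1 − y²)⁴ and E[Y] = ∫₀¹ (1 − y²)⁴ dy = 128/315 ≈ 0.4063. A quick Monte Carlo check (a draw from f is obtained by inversion: F(x) = x², so X = √U):

```python
import random

# Monte Carlo check of E[Y] for Y = min(X1,...,X4) with density f(x) = 2x
# on (0,1).  Inversion: F(x) = x**2, so X = sqrt(U) for U ~ Uniform(0,1).
# Theory: E[Y] = integral_0^1 (1 - y**2)**4 dy = 128/315.
random.seed(5)
n_trials = 200_000
total = 0.0
for _ in range(n_trials):
    total += min(random.random() ** 0.5 for _ in range(4))

est = total / n_trials
print(f"E[Y] ≈ {est:.4f} (theory {128/315:.4f})")
```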
Consider n independent and identically distributed random variables X1, X2, ..., Xn, each following a uniform distribution on the interval [0, 1]. a) What is the pdf of M = min(X1, X2, ..., Xn)? b) Give the expectation and variance of Σ_{i=1}^n Xi.
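For part (a), P(M > m) = (1 − m)ⁿ, so f_M(m) = n(1 − m)^(n−1) on [0, 1] and E[M] = 1/(n + 1); for part (b), E[Σ Xi] = n/2 and Var(Σ Xi) = n/12. A simulation sketch with the illustrative choice n = 5:

```python
import random

# Uniform(0,1) samples: min has pdf n*(1-m)**(n-1), so E[M] = 1/(n+1);
# the sum has mean n/2 and variance n/12.  n = 5 is illustrative.
random.seed(6)
n, n_trials = 5, 200_000
mins, sums = [], []
for _ in range(n_trials):
    xs = [random.random() for _ in range(n)]
    mins.append(min(xs))
    sums.append(sum(xs))

mean_min = sum(mins) / n_trials
mean_sum = sum(sums) / n_trials
var_sum = sum((s - mean_sum) ** 2 for s in sums) / (n_trials - 1)
print(f"E[M] ≈ {mean_min:.4f} (theory {1/(n+1):.4f})")
print(f"E[sum] ≈ {mean_sum:.4f} (theory {n/2}),  Var(sum) ≈ {var_sum:.4f} (theory {n/12:.4f})")
```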
2. Let X1, X2, ..., Xn denote independent and identically distributed random variables with variance σ^2. Which of the following is sufficient to conclude that the estimator T = f(X1, ..., Xn) of a parameter θ is consistent (fully justify your answer): (a) Var(T) → 0; (b) E(T) = ((n − 1)/n)θ and Var(T) → 0; (c) E(T) = θ; (d) E(T) = θ and Var(T) = σ^2/n.
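The standard route from bias and variance to consistency is Chebyshev's inequality applied to the MSE; for any ε > 0,

```latex
P\!\left(|T - \theta| \ge \varepsilon\right)
  \le \frac{E\!\left[(T - \theta)^2\right]}{\varepsilon^2}
  = \frac{\operatorname{Var}(T) + \left(E[T] - \theta\right)^2}{\varepsilon^2},
```

so T is consistent whenever the bias tends to 0 and Var(T) → 0 as n → ∞. In particular, a condition on E(T) alone says nothing about the spread of T, so unbiasedness by itself is not sufficient.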
Let X1, X2, ..., X30 be independent and identically distributed exponential random variables with mean 1. Calculate the probability that X̄ is greater than 1.1. a. 29% b. 71% c. 35%
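By the CLT, X̄ has mean 1 and variance 1/30, so P(X̄ > 1.1) ≈ P(Z > 0.1/√(1/30)). A quick computation:

```python
from statistics import NormalDist

# CLT sketch: Xbar of 30 i.i.d. Exponential(1) variables has mean 1 and
# variance 1/30, so P(Xbar > 1.1) ≈ P(Z > (1.1 - 1) / sqrt(1/30)).
z = (1.1 - 1.0) / (1 / 30) ** 0.5          # ≈ 0.5477
p = 1 - NormalDist().cdf(z)
print(f"z ≈ {z:.4f},  P(Xbar > 1.1) ≈ {p:.4f}")
```

The normal approximation gives about 0.29, matching choice (a).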
If X1 and X2 are independent and identically distributed normal random variables with mean m and variance s2, find the probability distribution function for U=X1-3X2/2.
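Any linear combination U = aX1 + bX2 of independent normals is again normal, with mean (a + b)m and variance (a² + b²)s². The printed expression "X1-3X2/2" is ambiguous; a simulation sketch under the reading U = (X1 − 3X2)/2, i.e. a = 1/2, b = −3/2, giving U ~ Normal(−m, (5/2)s²). The values m = 1, s = 2 are illustrative, not from the problem.

```python
import random
import statistics

# U = a*X1 + b*X2 for independent Normal(m, s**2) variables is
# Normal((a + b)*m, (a**2 + b**2)*s**2).  Reading "X1-3X2/2" as
# U = (X1 - 3*X2)/2 gives a = 1/2, b = -3/2, so U ~ Normal(-m, 2.5*s**2).
# m = 1, s = 2 are illustrative values.
random.seed(7)
m, s = 1.0, 2.0
a, b = 0.5, -1.5
us = [a * random.gauss(m, s) + b * random.gauss(m, s) for _ in range(200_000)]
mean_u = statistics.fmean(us)
var_u = statistics.variance(us)
print(f"mean(U) ≈ {mean_u:.3f} (theory {(a + b) * m})")
print(f"var(U)  ≈ {var_u:.3f} (theory {(a*a + b*b) * s * s})")
```

Under the alternative reading U = X1 − (3/2)X2, the same rule gives mean −m/2 and variance (13/4)s²; either way the pdf is the normal density with those parameters.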