Let X1, …, Xn be independent random variables with common probability density f(x) = ½ sin(x), x…
Let X = (X1, X2, X3) be an absolutely continuous random vector with the joint probability density function …, and 0 elsewhere. Calculate:
1. the probability of the event A = {X3 …}
3. the probability density function f_{X1,X3}(x1, x3) of the (X1, X3)-marginal
4. the probability density function f_{X1}(x1) of the X1-marginal, and the probability density function f_{X3}(x3) of the X3-marginal
5. Are X1 and X3 independent random variables?
6. E(X1) and Var(X1)
8. the covariance Cov(X1, X3) of X1 and X3
9. Which elements…
(7) Let X1, …, Xn be i.i.d. random variables, each with probability distribution function F and probability density function f. Define U = max{X1, …, Xn}, V = min{X1, …, Xn}. (a) Find the distribution function and the density function of U and of V. (b) Show that the joint density function of U and V is f_{U,V}(u, v) = n(n − 1) f(u) f(v) [F(u) − F(v)]^(n−2), if v < u.
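A quick Monte Carlo sanity check of part (a) for a concrete case: with X1, …, Xn i.i.d. Uniform(0,1) we have F(x) = x, so U = max has density n·u^(n−1) and V = min has density n·(1 − v)^(n−1), giving E[U] = n/(n + 1) and E[V] = 1/(n + 1). (The uniform parent is an illustrative assumption, not part of the problem.)

```python
import numpy as np

# X1,...,Xn i.i.d. Uniform(0,1): check E[max] = n/(n+1), E[min] = 1/(n+1).
rng = np.random.default_rng(0)
n, reps = 5, 200_000
x = rng.uniform(size=(reps, n))
U, V = x.max(axis=1), x.min(axis=1)

print(U.mean(), n / (n + 1))   # both close to 5/6
print(V.mean(), 1 / (n + 1))   # both close to 1/6
```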
5. Let X1, X2, …, Xn be a random sample from a distribution with finite variance. Show that (i) Cov(Xi − X̄, X̄) = 0 for each i, and (ii) ρ(Xi − X̄, Xj − X̄) = −1/(n − 1) for i ≠ j, i, j = 1, …, n. (Recall that ρ(X, Y) = Cov(X, Y)/√(Var(X) Var(Y)) for any two random variables X and Y.)
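An empirical check of part (ii): the residuals Xi − X̄ and Xj − X̄ (i ≠ j) have correlation −1/(n − 1) regardless of the parent distribution. The exponential parent below is an arbitrary illustrative choice.

```python
import numpy as np

# Estimate corr(X1 - Xbar, X2 - Xbar) over many independent samples of size n.
rng = np.random.default_rng(1)
n, reps = 4, 300_000
x = rng.exponential(size=(reps, n))
resid = x - x.mean(axis=1, keepdims=True)

r = np.corrcoef(resid[:, 0], resid[:, 1])[0, 1]
print(r, -1 / (n - 1))   # both close to -1/3
```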
2. Let X1, X2, …, Xn denote independent and identically distributed random variables with variance σ². Which of the following is sufficient to conclude that the estimator T = f(X1, …, Xn) of a parameter θ is consistent (fully justify your answer):
(a) Var(T) → 0.
(b) E(T) = θ(n − 1)/n and Var(T) → 0.
(c) E(T) = θ.
(d) E(T) = θ and Var(T) = σ².
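A sketch of the standard justification (not the full solution): if the bias and the variance both vanish, Chebyshev's inequality applied to (T − θ)² gives, for every ε > 0,

```latex
P\bigl(|T-\theta|>\varepsilon\bigr)
  \le \frac{E\,(T-\theta)^2}{\varepsilon^2}
  = \frac{\operatorname{Var}(T)+\bigl(E(T)-\theta\bigr)^2}{\varepsilon^2}
  \;\xrightarrow[n\to\infty]{}\; 0,
```

so T → θ in probability. Options that fix E(T) and Var(T) at constants, rather than letting them converge, do not force this bound to zero.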
4. Let X1, X2, … be uncorrelated random variables, such that Xn has a uniform distribution over [−1/n, 1/n]. Does the sequence converge in probability? 5. Let X1, X2, … be independent random variables, such that P(Xn = …) = P(Xn = −…) = … Does the sequence (X1 + X2 + … + Xn)/n satisfy the WLLN? Does it converge in probability to 0?
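For the first question: since |Xn| ≤ 1/n deterministically, P(|Xn| > ε) = 0 as soon as n > 1/ε, so Xn → 0 in probability. A small simulation makes the shrinking bound visible (ε = 0.05 is an arbitrary illustrative choice):

```python
import numpy as np

# Xn ~ Uniform[-1/n, 1/n]: the exceedance probability P(|Xn| > eps)
# is exactly 0 once n > 1/eps = 20.
rng = np.random.default_rng(2)
eps = 0.05
for n in (10, 100, 1000):
    xn = rng.uniform(-1 / n, 1 / n, size=100_000)
    print(n, np.mean(np.abs(xn) > eps))  # nonzero at n=10, exactly 0.0 after
```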
2. Let X1, X2, …, Xn denote a random sample from the probability density function …. Show that X(1) = min{X1, X2, …, Xn} is sufficient for θ. Hint: use an indicator function, since the support depends on θ.
Let X1, X2, …, Xn be independent random variables with mean μ and variance σX². Let Y1, Y2, …, Yn be independent random variables with mean μ and variance σY². Let W = aX̄ + (1 − a)Ȳ, where 0 < a < 1. a) Compute the expected value of W. b) For what value of a is the variance of W a minimum?
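A numerical sketch for part (b), assuming W = a·X̄ + (1 − a)·Ȳ with the two samples independent: then Var(W) = a²σX²/n + (1 − a)²σY²/n, which a grid search confirms is minimized at a* = σY²/(σX² + σY²) (the same value you get by setting the derivative in a to zero). The variances and sample size below are hypothetical.

```python
import numpy as np

# Grid-search the quadratic Var(W) = a^2*sx2/n + (1-a)^2*sy2/n over a in [0,1].
sx2, sy2, n = 4.0, 1.0, 25          # hypothetical variances and sample size
a = np.linspace(0, 1, 100_001)
var_w = a**2 * sx2 / n + (1 - a) ** 2 * sy2 / n
a_star = a[np.argmin(var_w)]
print(a_star, sy2 / (sx2 + sy2))    # both close to 0.2
```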
- Let {Xn} denote a sequence of i.i.d. random variables such that P(X1 = 1) = P(X1 = −1) = 1/2. Let Sn = X1 + X2 + … + Xn. (a) Find E(Sn) and Var(Sn); (b) Show that Sn is a martingale.
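A Monte Carlo check of part (a): the steps are i.i.d. with mean 0 and variance 1, so E(Sn) = 0 and Var(Sn) = n.

```python
import numpy as np

# Simulate many length-n symmetric +/-1 random walks and check the moments of Sn.
rng = np.random.default_rng(4)
n, reps = 50, 200_000
steps = rng.choice([-1, 1], size=(reps, n))
Sn = steps.sum(axis=1)
print(Sn.mean(), Sn.var())   # close to 0 and 50
```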
Let X1, X2, …, Xn be a random sample from the probability density function ….
a. Use the factorization theorem to show that X(1) = min{X1, …, Xn} is sufficient for θ.
b. Is X(1) minimal sufficient for θ?
1.9 Let X1, …, Xn be nonnegative integer-valued random variables with identical pf fX(·). A discrete mixture distribution W is created with pf fW(x) = p1 fX1(x) + … + pn fXn(x), where pi ≥ 0 for i = 1, …, n and Σ(i=1 to n) pi = 1. Another random variable Y is defined by Y = ….
(a) Compare the mean of W and Y.
(b) If X1, …, Xn are independent, compare the variance of W and Y.