1. Let X1, X2 be i.i.d. with density f(x) = 3e^{cx}, x ≥ 0. a. Find the value of c. b. Recognize this as a famous distribution that we've learned in class. Using your knowledge of this distribution, find the t such that P(X1 > t) = 0.98. c. Let M = max(X1, X2). Find P(M < 10).
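A quick numeric sanity check for (b) and (c), assuming the intended density is f(x) = 3e^{−3x} (i.e. c = −3, so X is Exponential with rate 3):

```python
import math

rate = 3.0  # assumption: c = -3, so f(x) = 3*exp(-3x) is Exponential(rate 3)

# (b) P(X1 > t) = exp(-rate*t) = 0.98  =>  t = -ln(0.98)/rate
t = -math.log(0.98) / rate
print(f"t = {t:.6f}")
print(f"check: P(X1 > t) = {math.exp(-rate * t):.2f}")

# (c) By independence, P(M < 10) = P(X1 < 10)^2 for M = max(X1, X2)
p_m = (1 - math.exp(-rate * 10.0)) ** 2
print(f"P(M < 10) = {p_m:.10f}")  # essentially 1, since e^(-30) is tiny
```

Note how small t comes out: a 0.98 upper-tail probability sits far into the left tail of an Exponential(3).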
7. Let X1, X2, ... be i.i.d. random variables, and let T(t) = min{n : Xn > t}, t ≥ 0. (a) Determine the distribution of T(t). (b) Show that, if p = P(X1 > t) → 0 as t → ∞, then pT(t) → Exp(1) in distribution as t → ∞.
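Since T(t) counts i.i.d. trials until the first exceedance of t, it is Geometric(p) with p = P(X1 > t). A small simulation sketch (using Exp(1) variables for the Xi, an arbitrary illustrative choice) shows the rescaled hitting time pT(t) lining up with the Exp(1) c.d.f.:

```python
import random
import math

random.seed(0)

t = 5.0                   # threshold; X_i ~ Exp(1) is an arbitrary choice
p = math.exp(-t)          # P(X1 > t) for Exp(1)

samples = []
for _ in range(20000):
    n = 1
    while random.expovariate(1.0) <= t:   # draw until the first exceedance
        n += 1
    samples.append(p * n)                 # rescaled hitting time p*T(t)

# Compare the empirical c.d.f. of p*T(t) with the Exp(1) c.d.f.
for x in (0.5, 1.0, 2.0):
    emp = sum(s <= x for s in samples) / len(samples)
    print(f"x={x}: empirical {emp:.3f} vs Exp(1) {1 - math.exp(-x):.3f}")
```

This is exactly the geometric-to-exponential limit: P(pT > x) = (1 − p)^{⌊x/p⌋} → e^{−x} as p → 0.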
Problem 2. (Convergence of Extreme Values) Let X1, X2, ... be an i.i.d. sample from the distribution with density f(x) = 1/x² for x > 1, and f(x) = 0 otherwise. Define Mn = min(X1, X2, ..., Xn) and answer the following questions. 1) Show that Mn → 1 in probability as n → ∞. 2) Show that n(Mn − 1) converges in distribution as n → ∞, and find the limit distribution.
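A simulation sketch, under my reading of the garbled density as the Pareto form f(x) = 1/x² for x > 1: then P(Mn > 1 + x/n) = (n/(n + x))^n → e^{−x}, so Mn concentrates at 1 and n(Mn − 1) should look like Exp(1):

```python
import random
import math

random.seed(1)

def pareto_sample():
    # Inverse-CDF sampling for f(x) = 1/x^2, x > 1:
    # F(x) = 1 - 1/x, so F^{-1}(u) = 1/(1 - u)
    return 1.0 / (1.0 - random.random())

n = 1000
scaled_minima = []
for _ in range(5000):
    mn = min(pareto_sample() for _ in range(n))
    scaled_minima.append(n * (mn - 1.0))   # candidate limit variable

# 1) Mn -> 1: the scaled values stay O(1), so Mn - 1 = O(1/n)
# 2) n(Mn - 1) ~ Exp(1): mean near 1, P(<= 1) near 1 - e^{-1}
mean = sum(scaled_minima) / len(scaled_minima)
frac = sum(s <= 1.0 for s in scaled_minima) / len(scaled_minima)
print(f"mean of n(Mn-1): {mean:.3f} (Exp(1) mean = 1)")
print(f"P(n(Mn-1) <= 1): {frac:.3f} (Exp(1): {1 - math.exp(-1):.3f})")
```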
Let X1, ..., Xn be i.i.d. with density f(x) = (1/θ) exp(−x/θ) for x > 0 and θ > 0. a. Find the Pitman estimator of θ. b. Show that the Pitman estimator has smaller risk than the UMVUE of θ when the loss function is (t − θ)²/θ². c. Suppose f(x) = θ exp(−θx) and that θ has a gamma prior with parameters α and β; find the Bayes estimator of θ. d. Find the minimum Bayes risk. e. Find the minimax estimator of θ, if one exists.
(7) Let X1, ..., Xn be i.i.d. random variables, each with probability distribution function F and probability density function f. Define U = max{X1, ..., Xn} and V = min{X1, ..., Xn}. (a) Find the distribution function and the density function of U and of V. (b) Show that the joint density function of U and V is f_{U,V}(u, v) = n(n − 1) f(u) f(v) [F(u) − F(v)]^{n−2}, if v < u.
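A Monte Carlo sketch checking the standard joint density n(n − 1) f(u) f(v) [F(u) − F(v)]^{n−2} for v < u, using Uniform(0,1) so F(x) = x: integrating that density over a < v < u < b gives (b − a)^n, which should match the empirical frequency of all n points landing in (a, b):

```python
import random

random.seed(2)

n = 5
trials = 200000
a, b = 0.2, 0.7

# Empirical P(V > a, U < b): the event that every X_i falls in (a, b)
hits = 0
for _ in range(trials):
    xs = [random.random() for _ in range(n)]
    if min(xs) > a and max(xs) < b:
        hits += 1
emp = hits / trials

# Integrating n(n-1)(u - v)^(n-2) over a < v < u < b gives (b - a)^n
exact = (b - a) ** n
print(f"empirical {emp:.4f} vs (b-a)^n = {exact:.4f}")
```

The exponent n − 2 (not n − 1) is what makes the double integral collapse to (b − a)^n.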
Let X1, ..., Xn be an i.i.d. sample from a Rayleigh distribution with parameter θ > 0: f(x|θ) = (x/θ²) e^{−x²/(2θ²)}, x ≥ 0. (This is an alternative parametrization of that of Example A in Section 3.6.2.) a. Find the method of moments estimate of θ. b. Find the MLE of θ. c. Find the asymptotic variance of the MLE.
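For reference, setting the score to zero for this density gives the MLE θ̂ = sqrt(Σxᵢ²/(2n)). A simulation sketch (sampling via the standard fact that θ·sqrt(−2 ln U) is Rayleigh(θ) for U ~ Uniform(0,1)) checks consistency:

```python
import random
import math

random.seed(3)

theta = 2.0
n = 5000

# Rayleigh(theta) via inverse CDF: F(x) = 1 - exp(-x^2/(2 theta^2)),
# so X = theta * sqrt(-2 ln U)
xs = [theta * math.sqrt(-2.0 * math.log(random.random())) for _ in range(n)]

# MLE from the likelihood equation: theta_hat = sqrt(sum(x^2) / (2n))
theta_hat = math.sqrt(sum(x * x for x in xs) / (2 * n))
print(f"theta = {theta}, MLE = {theta_hat:.3f}")
```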
5. Let X1, X2, ..., X100 be i.i.d. random variables following the normal distribution N(0, 10²). Let α denote the probability that there are at least 3 variables among them whose absolute value is larger than 19.6. Compute α, and give an approximate value of α with an error less than 0.01 according to the Poisson distribution. [15pts]
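Here P(|Xᵢ| > 19.6) = P(|Z| > 1.96) = 0.05, so the number of exceedances is Binomial(100, 0.05), approximated by Poisson(λ = 5). A sketch comparing the exact value of α with the Poisson approximation:

```python
import math

n, p = 100, 0.05   # P(|X| > 19.6) = P(|Z| > 1.96) = 0.05 for N(0, 10^2)

# Exact: alpha = P(Binomial(100, 0.05) >= 3) = 1 - P(0) - P(1) - P(2)
exact = 1.0 - sum(math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(3))

# Poisson approximation with lambda = n*p = 5
lam = n * p
approx = 1.0 - sum(math.exp(-lam) * lam**k / math.factorial(k) for k in range(3))

print(f"exact alpha   = {exact:.4f}")
print(f"Poisson appr. = {approx:.4f}, difference = {abs(exact - approx):.4f}")
```

The two values differ by well under 0.01, as the problem's error bound requires.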
Let X1, X2, ..., Xn be an i.i.d. sample from Bernoulli(p), and let Yn = … . Show that Yn converges to a degenerate distribution at 0 as n → ∞.
Let X1, X2, ..., Xn be a random sample from f(x) = θx^{θ−1}, for 0 < x < 1, 0 < θ < ∞. (a) Find the MLE of θ. (b) Let T = −log X. Find the pdf of T. (c) Find the pdf of Y = Σ Ti (i.e., the distribution of Y = −Σ log Xi). (d) Find E(…). (e) Find E(…). (f) Show that the variance of the MLE of θ → 0 as n → ∞. (g) Find the MME of θ.
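A simulation sketch for parts (a) and (c): sampling from f(x) = θx^{θ−1} via the inverse CDF X = U^{1/θ}, the MLE works out to θ̂ = n/(−Σ log Xᵢ) = n/Y, where Y ~ Gamma(n, rate θ); its mean should match the Gamma-based value nθ/(n − 1):

```python
import random
import math

random.seed(4)

theta = 3.0
n = 10
reps = 20000

mles = []
for _ in range(reps):
    # F(x) = x^theta on (0, 1), so X = U^(1/theta)
    xs = [random.random() ** (1.0 / theta) for _ in range(n)]
    y = -sum(math.log(x) for x in xs)   # Y ~ Gamma(n, rate theta)
    mles.append(n / y)                  # MLE: solve n/theta + sum(log x_i) = 0

mean_mle = sum(mles) / reps
print(f"mean of MLE: {mean_mle:.3f}, theory n*theta/(n-1) = {n*theta/(n-1):.3f}")
```

The upward bias factor n/(n − 1) is visible at n = 10 and vanishes as n grows, consistent with part (f).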
5. Let X1, ..., X100 be i.i.d. random variables with probability mass function f(x; θ) = θ(1 − θ)^x, x = 0, 1, 2, ..., 0 < θ < 1. Construct the uniformly most powerful test for H0: θ = 1/2 vs HA: θ < 1/2 at the significance level α = 0.01. Which theorems are you using? Hint: under H0, EX = 1 and Var X = 2.
Let X1, ..., Xn be an i.i.d. sample with a continuous distribution function F(·), and let X(1) < ... < X(n) be the order statistics of the sample. Let the constant Mp be defined by F(Mp) = p. Show that for 1 ≤ k1 ≤ k2 ≤ n, P{X(k1) ≤ Mp ≤ X(k2)} = P{k1 ≤ Binomial(n, p) < k2}.
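A Monte Carlo sketch of this identity using Uniform(0,1), where F(x) = x and hence Mp = p: count how often X(k1) ≤ Mp ≤ X(k2) and compare with the binomial sum Σ_{j=k1}^{k2−1} C(n,j) p^j (1 − p)^{n−j}:

```python
import random
from math import comb

random.seed(5)

n, p = 10, 0.3      # for Uniform(0,1), M_p = p
k1, k2 = 2, 5
trials = 100000

hits = 0
for _ in range(trials):
    xs = sorted(random.random() for _ in range(n))
    if xs[k1 - 1] <= p <= xs[k2 - 1]:   # X(k1) <= M_p <= X(k2)
        hits += 1
emp = hits / trials

# P(k1 <= Binomial(n, p) < k2), with B counting the X_i at or below M_p
exact = sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k1, k2))
print(f"empirical {emp:.4f} vs binomial sum {exact:.4f}")
```

The key step of the proof is visible in the code: X(k1) ≤ Mp exactly when at least k1 of the Xi fall at or below Mp, and Mp ≤ X(k2) exactly when fewer than k2 do.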