Suppose that X1, ..., Xn are i.i.d. with an exponential distribution with mean equal to 2.0. ...
Suppose that X1, X2, ..., Xn is a sequence of i.i.d. Exponential() variables. Use the delta method to approximate the distribution of 1/X̄ (the reciprocal of the sample mean).
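Since the exercise leaves the parameter unspecified, the rate λ and sample size n below are hypothetical choices, used only to illustrate the delta-method claim by simulation: with g(x) = 1/x, the approximation is 1/X̄ ≈ N(λ, λ²/n).

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 2.0      # hypothetical rate parameter (the exercise does not specify one)
n = 200        # hypothetical sample size
reps = 20000   # Monte Carlo replications

# Each row is one sample of size n; 1/(row mean) is the estimator 1/X̄.
samples = rng.exponential(scale=1.0 / lam, size=(reps, n))
est = 1.0 / samples.mean(axis=1)

# Delta method prediction: mean ≈ lam, standard deviation ≈ lam / sqrt(n).
print(est.mean())
print(est.std())
```

The empirical mean and standard deviation land close to λ and λ/√n, as the delta method predicts.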
8.60 (Modified): Let X1, ..., Xn be i.i.d. from an exponential distribution with the given density function.
a. Check the regularity assumptions, and find the Fisher information I(τ).
b. Find the CRLB.
c. Find a sufficient statistic for τ.
d. Show that t = X1 is unbiased, and use Rao-Blackwellization to construct the MVUE for τ.
e. Find the MLE of τ.
f. What is the exact sampling distribution of the MLE?
g. Use the central limit theorem to find a normal approximation to the sampling distribution.
h. ...
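The density is not shown in the problem, so the sketch below assumes τ is the exponential mean; under that parameterization the MLE is X̄, and since ΣXi ~ Gamma(n, τ), the exact sampling distribution in part (f) is X̄ ~ Gamma(shape = n, scale = τ/n), with mean τ and variance τ²/n. The values of τ and n are illustrative.

```python
import numpy as np

rng = np.random.default_rng(8)
tau, n, reps = 1.5, 8, 200000   # assumes tau is the mean; values are illustrative

xbar = rng.exponential(scale=tau, size=(reps, n)).mean(axis=1)

# Gamma(n, tau/n) prediction: mean tau, variance tau**2 / n.
print(xbar.mean())
print(xbar.var())
```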
Suppose X1, X2, ..., Xn are i.i.d. Exp(µ) with the density f(x) = … for x > 0.
(a) Use the method of moments to find estimators for µ and µ².
(b) What is the log-likelihood as a function of µ after observing X1 = x1, ..., Xn = xn?
(c) Find the MLEs for µ and µ². Are they the same as those you found in part (a)?
(d) According to the Central Limit...
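The density is elided above, so the following sketch assumes µ is the exponential mean. Under that assumption the MoM estimator of µ² comes from E[X²] = 2µ² (so µ̂² = m₂/2), while the MLE by invariance is X̄², and the two differ; a quick simulation with illustrative values of µ and n shows the MLE carries a small upward bias of µ²/n.

```python
import numpy as np

rng = np.random.default_rng(9)
mu, n, reps = 2.0, 50, 100000   # assumes mu is the mean; values are illustrative

x = rng.exponential(scale=mu, size=(reps, n))
mom = (x**2).mean(axis=1) / 2.0   # method of moments: E[X^2] = 2*mu^2
mle = x.mean(axis=1) ** 2         # MLE by invariance: (X̄)^2

print(mom.mean())   # close to mu**2 (unbiased)
print(mle.mean())   # close to mu**2 * (1 + 1/n) (slightly biased up)
```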
Suppose X1, X2, ..., Xn are i.i.d. exponential random variables with mean θ.
a. Find the Fisher information I(θ).
b. Find the CRLB.
c. Find a sufficient statistic for θ.
d. Show that θ̂ = X1 is unbiased, and use Rao-Blackwellization to construct the MVUE for θ.
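For part d, conditioning X1 on the sufficient statistic ΣXi gives E[X1 | ΣXi] = X̄ by symmetry, so the Rao-Blackwellized estimator is the sample mean. A simulation sketch (with illustrative θ and n) shows both estimators are unbiased but conditioning cuts the variance by a factor of n:

```python
import numpy as np

rng = np.random.default_rng(7)
theta, n, reps = 2.0, 10, 200000   # theta = exponential mean; values are illustrative

x = rng.exponential(scale=theta, size=(reps, n))
t1 = x[:, 0]          # the crude unbiased estimator X1
t2 = x.mean(axis=1)   # E[X1 | sum of the sample] = X̄, the Rao-Blackwellized version

print(t1.mean(), t2.mean())   # both close to theta: unbiasedness is preserved
print(t2.var() / t1.var())    # close to 1/n: conditioning shrinks the variance
```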
20. Let X1, ..., Xn be i.i.d. Exp(λ), the exponential distribution with failure rate λ. We showed in Sections 7.2 and 7.3 that λ̂ = 1/X̄ is both the MME and the MLE of λ, and that its asymptotic distribution is given by

√n (λ̂ − λ) →d N(0, λ²)    (8.53)

Use the normal distribution in (8.53) to obtain, via a variance-stabilizing transformation, an approximate 100(1 − α)% confidence interval for λ.
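A transformation g with g'(λ) ∝ 1/λ stabilizes the variance in (8.53), so g = log works: √n(log λ̂ − log λ) →d N(0, 1), giving the interval λ̂·exp(±z/√n). The sketch below checks the coverage of the 95% version by simulation; the values of λ and n are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
lam, n, reps = 3.0, 100, 10000   # hypothetical rate, sample size, replications
z = 1.96                         # standard normal quantile for a 95% interval

x = rng.exponential(scale=1.0 / lam, size=(reps, n))
lam_hat = 1.0 / x.mean(axis=1)

# Variance stabilization: sqrt(n) * (log(lam_hat) - log(lam)) ~ N(0, 1) approximately,
# so the interval for lam is lam_hat * exp(±z / sqrt(n)).
lo = lam_hat * np.exp(-z / np.sqrt(n))
hi = lam_hat * np.exp(+z / np.sqrt(n))
coverage = ((lo <= lam) & (lam <= hi)).mean()
print(coverage)   # close to 0.95
```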
Problem 5.1 (Relation between Gaussian and exponential). Suppose that X1 and X2 are i.i.d. N(0, 1).
(a) Show that Z = X1² + X2² is exponential with mean 2.
(b) True or False: Z is independent of Θ = tan⁻¹(X2/X1). Hint: Use the results from Example 5.4.3, which give the joint distribution of V and Θ.
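Reading the garbled statement as Z = X1² + X2² (the sum of squares, i.e. a chi-square with 2 degrees of freedom), part (a) can be checked by simulation: the empirical mean should be near 2 and P(Z > 2) near e⁻¹.

```python
import numpy as np

rng = np.random.default_rng(2)
reps = 200000
x1 = rng.standard_normal(reps)
x2 = rng.standard_normal(reps)
z = x1**2 + x2**2   # claimed: Exponential with mean 2 (chi-square, 2 df)

print(z.mean())        # close to 2.0
print((z > 2).mean())  # close to exp(-1)
```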
Let X1, ..., Xn be an i.i.d. sample with a continuous distribution function F(·), and let X(1) < ... < X(n) be the order statistics of the sample. Let the constant Mp be defined by F(Mp) = p. Show that for 1 ≤ k1 ≤ k2 ≤ n,

P{X(k1) ≤ Mp ≤ X(k2)} = P{k1 ≤ Binomial(n, p) < k2}.
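The key step is that {X(k1) ≤ Mp} occurs exactly when at least k1 of the Xi fall at or below Mp, and the count of such observations is Binomial(n, p). A simulation with an illustrative choice of F (Uniform(0,1), so M₀.₅ = 0.5) and illustrative k1, k2 checks the identity:

```python
import numpy as np
from math import comb

rng = np.random.default_rng(3)
n, p, k1, k2 = 10, 0.5, 3, 8   # illustrative choices
mp = 0.5                       # median of Uniform(0,1), so F(mp) = p

reps = 100000
x = np.sort(rng.uniform(size=(reps, n)), axis=1)
hit = (x[:, k1 - 1] <= mp) & (mp <= x[:, k2 - 1])   # event X(k1) <= Mp <= X(k2)

empirical = hit.mean()
binomial = sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k1, k2))
print(empirical, binomial)   # the two agree up to Monte Carlo error
```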
(7) Let X1, ..., Xn be i.i.d. random variables, each with distribution function F and probability density function f. Define U = max{X1, ..., Xn} and V = min{X1, ..., Xn}.
(a) Find the distribution function and the density function of U and of V.
(b) Show that the joint density function of U and V is

f_{U,V}(u, v) = n(n − 1) f(u) f(v) [F(u) − F(v)]^(n−2),  if v < u.
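The usual route to part (b) is the joint relation P(v < V and U ≤ u) = [F(u) − F(v)]^n (all n observations land in (v, u]), which differentiating in u and v turns into the stated density. A simulation with F = Uniform(0,1) and illustrative u, v checks that starting point:

```python
import numpy as np

rng = np.random.default_rng(6)
n, reps = 5, 200000   # illustrative sample size
x = rng.uniform(size=(reps, n))
u_max, v_min = x.max(axis=1), x.min(axis=1)

# P(v < V and U <= u) = (F(u) - F(v))^n; for Uniform(0,1), F(t) = t.
u, v = 0.8, 0.2
prob = ((v_min > v) & (u_max <= u)).mean()
print(prob)   # close to (0.8 - 0.2)**5
```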
1. Suppose that X1, X2, ..., Xn is a random sample from an exponential distribution with pdf f(x) = (1/θ)e^(−x/θ), x > 0. Let X(1) = min{X1, X2, ..., Xn}. Consider the following two estimators for θ: θ̂1 = nX(1) and θ̂2 = X̄.
(a) Show that θ̂1 is an unbiased estimator of θ.
(b) Find the relative efficiency of θ̂1 to θ̂2.
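Since X(1) is exponential with mean θ/n, the estimator nX(1) is unbiased but its variance is θ², compared with θ²/n for X̄, giving relative efficiency 1/n. A simulation with illustrative θ and n confirms both facts:

```python
import numpy as np

rng = np.random.default_rng(4)
theta, n, reps = 2.0, 5, 200000   # theta = exponential mean; values are illustrative

x = rng.exponential(scale=theta, size=(reps, n))
t1 = n * x.min(axis=1)   # theta_hat_1 = n * X(1)
t2 = x.mean(axis=1)      # theta_hat_2 = X̄

print(t1.mean(), t2.mean())   # both close to theta: both unbiased
print(t1.var() / t2.var())    # close to n: X̄ is n times more efficient
```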
3.10 (i) If X1, ..., Xn are i.i.d. according to the exponential density e^(−x), x > 0, show that

P[X(n) − log n ≤ y] → e^(−e^(−y)),  −∞ < y < ∞.    (2.9.3)

(ii) Show that the right side of (2.9.3) is a cumulative distribution function. (The distribution with this cdf is called the extreme value distribution.)
(iii) Graph the cdf of X(n) − log n for n = 1, 2, 5 together with the limit e^(−e^(−y)).
(iv) Graph the densities corresponding to the cdf's...
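Part (i) follows from P[X(n) ≤ log n + y] = (1 − e^(−y)/n)^n → e^(−e^(−y)). The sketch below samples the maximum directly by inverting its cdf (1 − e^(−x))^n and compares the empirical cdf of X(n) − log n with the Gumbel limit at a couple of illustrative points:

```python
import numpy as np

rng = np.random.default_rng(5)
n, reps = 1000, 200000   # illustrative sample size and replication count

# Sample the maximum of n unit exponentials via the inverse of its cdf (1 - e^{-x})^n.
u = rng.uniform(size=reps)
shifted_max = -np.log(1.0 - u ** (1.0 / n)) - np.log(n)   # X(n) - log n

# Compare the empirical cdf with the extreme value limit exp(-exp(-y)).
for y in (0.0, 1.0):
    print((shifted_max <= y).mean(), np.exp(-np.exp(-y)))
```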