Suppose X ~ Exp(λ) with f(x | λ) = λe^(−λx), x > 0, and we have n = 1.
Find an unbiased estimate of 1/λ.
Find an unbiased estimate of 1/λ².
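A quick Monte Carlo sanity check for this problem (the value λ = 2 and the seed are illustrative, not from the problem): for X ~ Exp(λ) we have E[X] = 1/λ and E[X²] = 2/λ², which suggests X and X²/2 as the n = 1 candidates.

```python
import random

random.seed(0)
lam = 2.0            # illustrative rate parameter
N = 200_000
xs = [random.expovariate(lam) for _ in range(N)]

# E[X] = 1/lam, so with n = 1 the single observation X is unbiased for 1/lam.
est_inv_lam = sum(xs) / N
# E[X^2] = 2/lam^2, so X^2/2 is unbiased for 1/lam^2.
est_inv_lam2 = sum(x * x / 2 for x in xs) / N

print(round(est_inv_lam, 3))   # should be near 1/lam = 0.5
print(round(est_inv_lam2, 3))  # should be near 1/lam^2 = 0.25
```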
Let X1, ..., Xn be i.i.d. with density f(x | θ) = (1/θ) exp(−x/θ) for x > 0 and θ > 0.
a. Find the Pitman estimator of θ.
b. Show that the Pitman estimator has smaller risk than the UMVUE of θ when the loss function is (t − θ)²/θ².
c. Suppose f(x) = θ exp(−θx) and that θ has a gamma prior with parameters α and β. Find the Bayes estimator of θ.
d. Find the minimum Bayes risk.
e. Find the minimax estimator of θ if one exists.
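For part (c), the gamma prior is conjugate to this exponential likelihood, so the posterior is again gamma and the squared-error Bayes estimator is its mean. A small numerical check of that closed form against brute-force grid integration, with hypothetical values α = 2, β = 1 (read as shape and rate) and made-up data:

```python
import math

# Hypothetical numbers for illustration only: prior Gamma(alpha, beta) in the
# shape/rate parametrization, data x_1..x_n from f(x|theta) = theta*exp(-theta*x).
alpha, beta = 2.0, 1.0
data = [0.5, 1.2, 0.3, 2.0]
n, s = len(data), sum(data)

# Conjugacy: posterior is Gamma(alpha + n, beta + s), so the Bayes estimator
# under squared-error loss is the posterior mean (alpha + n) / (beta + s).
closed_form = (alpha + n) / (beta + s)

def unnorm_post(t):
    # prior density kernel * likelihood, unnormalized
    return t ** (alpha - 1) * math.exp(-beta * t) * t ** n * math.exp(-t * s)

# Riemann-sum posterior mean over a grid wide enough to capture the mass.
h = 0.001
grid = [i * h for i in range(1, 40000)]
num = sum(t * unnorm_post(t) * h for t in grid)
den = sum(unnorm_post(t) * h for t in grid)

print(round(num / den, 4), round(closed_form, 4))  # both near 1.2
```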
Let...
1. (Easy) Let X ~ N(µ, σ²). Suppose we are interested in estimating the cube of the population mean, µ³. Consider the biased estimate µ̂³ = X̄³. Calculate the expectation of this estimator and propose an unbiased estimator of µ³. Hint: the third central moment of X is given by E(X − µ)³ = E(X³) − 3µE(X²) + 3µ²E(X) − µ³.
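A simulation sketch of what the hint leads to (the values µ = 2, σ = 1.5, n = 10 are illustrative; σ is assumed known here): since X̄ is normal, E[X̄³] = µ³ + 3µσ²/n, so subtracting 3σ²X̄/n should remove the bias.

```python
import random

random.seed(1)
mu, sigma, n = 2.0, 1.5, 10   # illustrative values, sigma assumed known
reps = 100_000

biased, corrected = 0.0, 0.0
for _ in range(reps):
    xbar = sum(random.gauss(mu, sigma) for _ in range(n)) / n
    biased += xbar ** 3
    # For normal Xbar, E[Xbar^3] = mu^3 + 3*mu*sigma^2/n (the skewness term
    # vanishes), so xbar^3 - 3*sigma^2*xbar/n is unbiased for mu^3.
    corrected += xbar ** 3 - 3 * sigma ** 2 * xbar / n

print(round(biased / reps, 3))     # near mu^3 + 3*mu*sigma^2/n = 9.35
print(round(corrected / reps, 3))  # near mu^3 = 8
```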
1. Suppose that Ω is finite and suppose that we have a probability mass function f on Ω. For this problem assume that for all ω ∈ Ω we have f(ω) > 0. Consider the vector space L²(Ω) consisting of all functions φ : Ω → R, and also equip the vector space with the inner product ⟨φ, ψ⟩ = Σ_{ω∈Ω} φ(ω)ψ(ω)f(ω). Suppose that we have a function X : Ω → R. Let X = {x ∈ R : ∃ω ∈ Ω s.t. X(ω) = ...
6. Suppose X and Y have the joint pdf f_{X,Y}(x, y) = 2 exp(−x − y) for 0 < x < y, and 0 otherwise.
a. Find ρ_{X,Y}, the correlation coefficient between X and Y.
b. Let U = 2X − 1 and V = Y + 2. What is ρ_{U,V}, the correlation coefficient between U and V?
c. Repeat (b) if U = −7X and V = Y + ln 2.
d. Let W = Y − X. Compute Var(W).
e. Refer to (d). Find an interval that will...
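A Monte Carlo sanity check for parts (a) and (d). Two independent Exp(1) draws conditioned on x < y have exactly the density 2e^(−x−y) on 0 < x < y, so rejection sampling gives draws from the joint; the estimates should land near ρ = 1/√5 ≈ 0.447 and Var(W) = 1.

```python
import random

random.seed(2)
xs, ys = [], []
while len(xs) < 200_000:
    x, y = random.expovariate(1), random.expovariate(1)
    # Two iid Exp(1) draws conditioned on x < y have joint density
    # 2*exp(-x-y) on 0 < x < y, matching the problem's pdf.
    if x < y:
        xs.append(x)
        ys.append(y)

n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / n
vx = sum((a - mx) ** 2 for a in xs) / n
vy = sum((b - my) ** 2 for b in ys) / n
rho = cov / (vx * vy) ** 0.5          # part (a): should be near 1/sqrt(5)

w = [b - a for a, b in zip(xs, ys)]
mw = sum(w) / n
var_w = sum((v - mw) ** 2 for v in w) / n   # part (d): should be near 1

print(round(rho, 3), round(var_w, 3))
```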
Suppose we want to estimate a parameter θ of a certain distribution and we have the following independent point estimates:
θ̂₁ ~ N(θ + 0.1, 0.01)
θ̂₂ ~ N(θ, 0.04)
a) What are the mean square errors for these point estimates? (4 pts)
b) Find a point estimate with mean square error less than or equal to 0.01. (2 pts)
c) Using only θ̂₁ and θ̂₂, find the unbiased estimator with the smallest variance possible. What is that estimator? What is the smallest variance? (6 pts)
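A simulation sketch of the arithmetic involved (θ = 1 is an arbitrary placeholder): MSE = bias² + variance gives 0.02 and 0.04 for the two estimates, and one natural candidate for (c) is to debias θ̂₁ and then weight inversely to variance.

```python
import random

random.seed(3)
theta = 1.0          # true parameter, arbitrary for this check
reps = 200_000

sq1 = sq2 = sqc = 0.0
for _ in range(reps):
    t1 = random.gauss(theta + 0.1, 0.1)   # theta_hat_1 ~ N(theta + 0.1, 0.01)
    t2 = random.gauss(theta, 0.2)         # theta_hat_2 ~ N(theta, 0.04)
    sq1 += (t1 - theta) ** 2              # MSE = 0.1^2 + 0.01 = 0.02
    sq2 += (t2 - theta) ** 2              # MSE = 0 + 0.04 = 0.04
    # Candidate for (c): debias t1, then combine with weights minimizing
    # a^2*0.01 + (1-a)^2*0.04, i.e. a = 0.8, giving variance 0.008.
    comb = 0.8 * (t1 - 0.1) + 0.2 * t2
    sqc += (comb - theta) ** 2

print(round(sq1 / reps, 4), round(sq2 / reps, 4), round(sqc / reps, 4))
```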
Suppose we...
Suppose X1, X2, ..., Xn (n ≥ 5) are i.i.d. Exp(µ) with the density f(x) = (1/µ) e^(−x/µ) for x > 0.
(a) Let µ̂₁ = X̄. Show µ̂₁ is a minimum variance unbiased estimator.
(b) Let µ̂₂ = (X1 + X2)/2. Show µ̂₂ is unbiased. Calculate Var(µ̂₂). Confirm Var(µ̂₁) < Var(µ̂₂). Calculate the efficiency of µ̂₂ relative to µ̂₁.
(c) Show X̄ is consistent and sufficient.
(d) Show µ̂₂ is not consistent...
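A simulation sketch for part (b), with illustrative values µ = 2 and n = 5: the theory gives Var(µ̂₁) = µ²/n and Var(µ̂₂) = µ²/2, so the relative efficiency of µ̂₂ is 2/n.

```python
import random

random.seed(4)
mu, n, reps = 2.0, 5, 100_000   # illustrative values

m1, m2 = [], []
for _ in range(reps):
    xs = [random.expovariate(1 / mu) for _ in range(n)]  # Exp with mean mu
    m1.append(sum(xs) / n)          # mu_hat_1 = sample mean
    m2.append((xs[0] + xs[1]) / 2)  # mu_hat_2 = (X1 + X2)/2

def var(v):
    m = sum(v) / len(v)
    return sum((x - m) ** 2 for x in v) / len(v)

# Theory: Var(mu_hat_1) = mu^2/n = 0.8 and Var(mu_hat_2) = mu^2/2 = 2.0,
# so the efficiency of mu_hat_2 relative to mu_hat_1 is 2/n = 0.4.
print(round(var(m1), 3), round(var(m2), 3))
```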
Let X and Y be independent random variables with X ~ N(0, 1) and Y ~ Exp(1). Find E(|X|(Y + 1)²).
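A Monte Carlo check of the factorization this problem invites: by independence, E[|X|(Y + 1)²] = E|X| · E[(Y + 1)²] = √(2/π) · 5.

```python
import math
import random

random.seed(5)
reps = 500_000
acc = 0.0
for _ in range(reps):
    x = random.gauss(0, 1)
    y = random.expovariate(1)
    acc += abs(x) * (y + 1) ** 2

# Independence lets the expectation factor:
# E|X| = sqrt(2/pi) for N(0,1), and E[(Y+1)^2] = E[Y^2] + 2E[Y] + 1 = 5,
# so E[|X|(Y+1)^2] = 5*sqrt(2/pi) ≈ 3.989.
print(round(acc / reps, 3), round(5 * math.sqrt(2 / math.pi), 3))
```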
We have a function F : {0, . . . , n − 1} → {0, . . . , m − 1}. We know that, for 0 ≤ x, y ≤ n − 1, F((x + y) mod n) = (F(x) + F(y)) mod m. The only way we have for evaluating F is to use a lookup table that stores the values of F. Unfortunately, an Evil Adversary has changed the value of 1/5 of the table entries...
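One classical angle on this setup is self-correction: the identity F(x) = (F((x + y) mod n) − F(y)) mod m lets us evaluate F(x) through two random-looking lookups and a plurality vote. The sketch below assumes the corrupted entries hold scattered random wrong values; the sizes n = 1000, m = 100 and the witness function F(x) = 3x mod m (additive because 3n ≡ 0 mod m) are hypothetical choices, not from the problem.

```python
import random
from collections import Counter

random.seed(6)
n, m, c = 1000, 100, 3          # hypothetical sizes; c*n ≡ 0 (mod m) makes F additive
table = [(c * x) % m for x in range(n)]   # honest table for F(x) = c*x mod m

# The Evil Adversary corrupts 1/5 of the entries with random wrong values.
for i in random.sample(range(n), n // 5):
    table[i] = (table[i] + 1 + random.randrange(m - 1)) % m

def query(x, trials=41):
    # Self-correction via F(x) = (F((x + r) mod n) - F(r)) mod m.
    # r is uniform, so each of the two lookups lands on a corrupted cell with
    # probability 1/5; a single trial is correct with probability >= 3/5, and
    # scattered wrong values let a plurality vote recover F(x).
    votes = Counter()
    for _ in range(trials):
        r = random.randrange(n)
        votes[(table[(x + r) % n] - table[r]) % m] += 1
    return votes.most_common(1)[0][0]

print(all(query(x) == (c * x) % m for x in range(0, n, 37)))
```

Note the vote is needed only because a single trial can hit a corrupted cell; against an adversary who coordinates the wrong values, a stronger analysis than this sketch would be required.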
2. Suppose that X ~ Exp(λ) for some λ > 0. We know that the moment generating function of X is given by M(t) = E[e^(tX)] = λ/(λ − t) for some appropriate set of values of t.
(a) Derive this mgf result. Explain what condition on t is necessary for this expression to be valid, and why this condition is necessary.
(b) Use the mgf to find the first four moments (µ₁′, µ₂′, µ₃′, and µ₄′) of X.
(c) Use your results in part...
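A numerical sanity check of the mgf and of the moments it generates (λ = 2 and t = 0.5 are illustrative; t < λ is needed for E[e^(tX)] to be finite): differentiating M(t) = λ/(λ − t) gives the k-th raw moment k!/λ^k.

```python
import math
import random

random.seed(7)
lam, t = 2.0, 0.5    # illustrative; the integral defining M(t) needs t < lam
N = 300_000
xs = [random.expovariate(lam) for _ in range(N)]

# Monte Carlo E[e^{tX}] versus the closed form lam/(lam - t).
mc_mgf = sum(math.exp(t * x) for x in xs) / N
print(round(mc_mgf, 3), round(lam / (lam - t), 3))   # both near 4/3

# Moments from the mgf: the k-th derivative of lam/(lam - t) at t = 0 is
# k!/lam^k; check k = 2 against the sample second moment.
mc_m2 = sum(x * x for x in xs) / N
print(round(mc_m2, 3), round(math.factorial(2) / lam ** 2, 3))  # both near 0.5
```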
Prove that for n ∈ N, n > 0, we have 1 × 1! + 2 × 2! + ... + n × n! = (n + 1)! − 1.
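Before writing the induction, the identity can be spot-checked for small n; the key telescoping observation is k · k! = (k + 1)! − k!.

```python
import math

# Check 1*1! + 2*2! + ... + n*n! = (n+1)! - 1 for n = 1..11.
# (Telescoping step behind the proof: k*k! = (k+1)! - k!.)
for n in range(1, 12):
    lhs = sum(k * math.factorial(k) for k in range(1, n + 1))
    assert lhs == math.factorial(n + 1) - 1
print("identity holds for n = 1..11")
```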