4.6 Let X1, ..., Xn be independent r.v.'s distributed as P(θ), θ ∈ Ω = (0, ∞), ... estimate δ(x1, ...
1.5 If X1, ..., Xn are independent r.v.'s distributed as B(k, θ), θ ∈ Ω = (0, 1), with respective observed values x1, ..., xn, show that x̄/k is the MLE of θ, where x̄ is the sample mean of the x's.
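As a quick numerical sanity check of this MLE (a sketch with hypothetical values k = 5, θ = 0.3, n = 200, none of which come from the exercise), one can grid-search the Binomial log-likelihood and confirm the maximizer lands at x̄/k:

```python
import math
import random

random.seed(0)

# Hypothetical setup: n = 200 draws from Binomial(k = 5, theta = 0.3).
k, theta, n = 5, 0.3, 200
xs = [sum(random.random() < theta for _ in range(k)) for _ in range(n)]
xbar = sum(xs) / n

def log_lik(t):
    # Binomial(k, t) log-likelihood, dropping the constant binomial coefficients.
    return sum(x * math.log(t) + (k - x) * math.log(1 - t) for x in xs)

# The log-likelihood is concave with its maximum at xbar/k, so the grid
# argmax must land within one grid step (0.001) of xbar/k.
grid = [i / 1000 for i in range(1, 1000)]
t_hat = max(grid, key=log_lik)
```

This does not prove the result, but it makes the claimed maximizer easy to see before doing the calculus.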
5.7 Let X1, ..., Xn be independent r.v.'s from the U(θ − a, θ + b) distribution, where a and b are (known) positive constants and θ ∈ Ω = ℝ. Determine the moment estimate θ̂ of θ, and compute its expectation and variance.
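For intuition (not a worked solution), the moment estimate works out to θ̂ = X̄ − (b − a)/2, with E θ̂ = θ and Var θ̂ = (a + b)²/(12n). A small simulation with hypothetical values θ = 2, a = 1, b = 3, n = 50 can be used to check this:

```python
import random

random.seed(1)

# Hypothetical check: X ~ U(theta - a, theta + b) = U(1, 5) here, and the
# moment estimate theta_hat = xbar - (b - a)/2 should be unbiased with
# variance (a + b)^2 / (12 n).
theta, a, b, n, reps = 2.0, 1.0, 3.0, 50, 4000
ests = []
for _ in range(reps):
    xbar = sum(random.uniform(theta - a, theta + b) for _ in range(n)) / n
    ests.append(xbar - (b - a) / 2)
mean_est = sum(ests) / reps
var_est = sum((e - mean_est) ** 2 for e in ests) / reps
# mean_est should be close to theta = 2; var_est close to 16/600 ≈ 0.0267.
```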
3.18 Let the r.v. X have the Geometric p.d.f. f(x; θ) = θ(1 − θ)^(x−1), x = 1, 2, .... (i) Show that X is both sufficient and complete. (ii) Show that the estimate U defined by U(X) = 1 if X = 1 and U(X) = 0 if X ≥ 2 is an unbiased estimate of θ. (iii) Conclude that U is the UMVU estimate of θ and also an entirely unreasonable estimate.
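Part (ii) can be sanity-checked by simulation (a sketch with a hypothetical θ = 0.3): since P(X = 1) = θ under this Geometric p.d.f., the indicator U = 1{X = 1} has mean θ.

```python
import random

random.seed(2)

theta, reps = 0.3, 20000  # hypothetical parameter and replication count

def geom(theta):
    # Draw X on {1, 2, ...}: count Bernoulli(theta) trials until first success.
    x = 1
    while random.random() >= theta:
        x += 1
    return x

# Average of U = 1{X = 1} over many draws should be close to theta.
u_bar = sum(1 if geom(theta) == 1 else 0 for _ in range(reps)) / reps
```

The simulation also illustrates why U is "entirely unreasonable": it throws away every observation larger than 1.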
5. Let X be uniformly distributed in [0, 1]. Given X = x, the r.v. Y is uniformly distributed in [0, x] for 0 < x < 1. (a) Specify the joint pdf f_XY(x, y) and sketch its region of support Ω_XY. (b) Determine f_X|Y(x | 0.25). (c) Determine the probability P(X < 2Y). (d) Determine the probability P(X + Y ≤ 1).
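A Monte Carlo sketch for parts (c) and (d), with a hypothetical replication count; the exact answers work out to P(X < 2Y) = 1/2 and P(X + Y ≤ 1) = ln 2 ≈ 0.693, which the simulation should reproduce:

```python
import random

random.seed(3)

# Sample the hierarchical model: X ~ U(0, 1), then Y | X = x ~ U(0, x).
reps = 200000
c = d = 0
for _ in range(reps):
    x = random.random()
    y = random.uniform(0, x)
    c += x < 2 * y       # event for part (c)
    d += x + y <= 1      # event for part (d)
p_c, p_d = c / reps, d / reps
```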
3.13 Let X1, ..., Xn be i.i.d. r.v.'s from the Gamma distribution with parameter α known and β = θ ∈ Ω = (0, ∞) unknown. (i) Determine the Fisher information I(θ). (ii) Show that the estimate U = U(X1, ..., Xn) = X̄/α is unbiased and calculate its variance.
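A simulation sketch for part (ii), with hypothetical values α = 3, θ = 2, n = 40: since E X = αθ for Gamma(α, θ) (shape–scale parameterization), U = X̄/α should be unbiased for θ with variance θ²/(nα).

```python
import random

random.seed(4)

alpha, theta, n, reps = 3, 2.0, 40, 4000  # hypothetical values
ests = []
for _ in range(reps):
    # random.gammavariate(shape, scale): mean = shape * scale = alpha * theta.
    xbar = sum(random.gammavariate(alpha, theta) for _ in range(n)) / n
    ests.append(xbar / alpha)
mean_u = sum(ests) / reps
var_u = sum((e - mean_u) ** 2 for e in ests) / reps
# mean_u should be close to theta = 2; var_u close to 4/120 ≈ 0.0333.
```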
MA2500/18 8. Let X be a random variable and let f(x; θ) be its PDF, where θ is an unknown scalar parameter. We wish to test the simple null hypothesis H0: θ = θ0 against the simple alternative H1: θ = θ1. (a) Define the simple likelihood ratio test (SLRT) of H0 against H1. (b) Show that the SLRT is a most powerful test of H0 against H1. (c) Let X1, X2, ..., Xn be a random sample of observations from the Poisson(θ)...
This is a challenging question. Let X ~ POI(μ), and let θ = P(X = 0) = e^(−μ). (a) Is θ̃ = e^(−X) an unbiased estimator of θ? (b) Show that θ̂ = u(X) is an unbiased estimator of θ, where u(0) = 1 and u(x) = 0 if x ≥ 1. (c) Compare the MSEs of θ̃ and θ̂ for estimating θ = e^(−μ), when μ = 1 and 2.
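Part (c) can be explored by simulation; a sketch at μ = 1 only (the exercise also asks for μ = 2). Note θ̃ = e^(−X) is biased, since E e^(−X) = exp(−μ(1 − e^(−1))) ≠ e^(−μ), while θ̂ = 1{X = 0} is unbiased by part (b).

```python
import math
import random

random.seed(5)

mu, reps = 1.0, 100000
theta = math.exp(-mu)  # the target, e^{-mu}

def pois(mu):
    # Inverse-CDF draw of a Poisson(mu) variate.
    u, x, p, cdf = random.random(), 0, math.exp(-mu), math.exp(-mu)
    while u > cdf:
        x += 1
        p *= mu / x
        cdf += p
    return x

mse_tilde = mse_hat = 0.0
for _ in range(reps):
    x = pois(mu)
    mse_tilde += (math.exp(-x) - theta) ** 2        # biased estimator e^{-X}
    mse_hat += ((1 if x == 0 else 0) - theta) ** 2  # unbiased indicator 1{X=0}
mse_tilde /= reps
mse_hat /= reps
```

The comparison is the interesting part: unbiasedness alone does not guarantee the smaller MSE.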
4. Let X and Y be independent standard normal random variables. The pair (X, Y) can be described in polar coordinates in terms of random variables R ≥ 0 and Θ ∈ [0, 2π], so that X = R cos Θ, Y = R sin Θ. (a) (10 points) Show that Θ is uniformly distributed in [0, 2π] and that R and Θ are independent. (b) (10 points) Show that R² has an exponential distribution with parameter 1/2, that R has the...
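Both parts admit a quick simulation sanity check (a sketch, checking only the means rather than the full distributions): R² = X² + Y² should have mean 2, consistent with an Exponential(1/2) law, and Θ wrapped to [0, 2π) should have mean π, consistent with uniformity.

```python
import math
import random

random.seed(6)

reps = 100000
r2_sum = th_sum = 0.0
for _ in range(reps):
    x, y = random.gauss(0, 1), random.gauss(0, 1)
    r2_sum += x * x + y * y
    # atan2 returns an angle in (-pi, pi]; wrap it into [0, 2*pi).
    th_sum += math.atan2(y, x) % (2 * math.pi)
mean_r2 = r2_sum / reps  # should be near 2
mean_th = th_sum / reps  # should be near pi
```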
5. Let X1, . . . , Xn be a random sample from f(x; θ), x > 0. (a) Assume that θ = 0.2. Using the Inversion Method of Sampling, write an R function to generate data from f(x; θ). (b) Use your function in (a) to draw a sample of size 100 from f(x; 0.2). (c) Find the method of moments estimate of θ using the data in (b). (d) Find the maximum likelihood estimate of θ using...
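A sketch of the inversion step, in Python rather than the R the exercise requests, and under the assumption (the printed density is illegible) that f(x; θ) is the exponential density (1/θ)e^(−x/θ), x > 0. Inversion then gives X = −θ log(1 − U) for U ~ U(0, 1):

```python
import math
import random

random.seed(7)

def rexp_inv(n, theta):
    # Inversion method, ASSUMING f(x; theta) = (1/theta) * exp(-x/theta):
    # F(x) = 1 - exp(-x/theta), so F^{-1}(u) = -theta * log(1 - u).
    return [-theta * math.log(1 - random.random()) for _ in range(n)]

sample = rexp_inv(100, 0.2)  # part (b): a sample of size 100 at theta = 0.2
# Under this assumed parameterization, both the method-of-moments and the
# maximum likelihood estimates of theta reduce to the sample mean.
theta_hat = sum(sample) / len(sample)
```

In R the same sketch would use `runif` inside a function and `mean()` for the estimate; the inversion formula is identical.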