2. [20 marks] Let X1, . . . , Xn be a random sample drawn independently from a one-parameter curved normal distribution which has density

f(x; θ) = (1/√(2πθ)) exp(−(x − θ)²/(2θ)), −∞ < x < ∞, θ > 0,

i.e., Xi ~ N(θ, θ) iid. Let X̄n denote the sample mean and Tn² = …

(d) [3 …
(e) [3 marks] Find the Fisher information I(θ).
(f) [3 marks] Is θ̂2 an MVUE of θ? Justify your answer.
(g) [3 marks] Assume that T = 1.32 and x̄ = 3.76 for a random sample of size n = 100. Find the Wald 95% confidence interval …
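As a hedged sketch of part (g): the Fisher information used below does not appear in this excerpt; it is my own hand computation for the assumed N(θ, θ) model, and `wald_ci` simply applies the generic Wald formula θ̂ ± z/√(n·I(θ̂)).

```python
import math

def fisher_info(theta):
    # Fisher information of one N(theta, theta) observation.
    # Derived by hand (assumption): I(theta) = 1/theta + 1/(2*theta^2).
    return 1.0 / theta + 1.0 / (2.0 * theta**2)

def wald_ci(theta_hat, n, z=1.96):
    # Generic Wald 95% CI: theta_hat +/- z / sqrt(n * I(theta_hat)).
    se = 1.0 / math.sqrt(n * fisher_info(theta_hat))
    return theta_hat - z * se, theta_hat + z * se

# Illustrative call only; the MLE value 1.0 is made up, not the answer to (g).
lo, hi = wald_ci(1.0, 100)
```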
Exercise 6. Let (X1, . . . , Xn) be an iid sample from the Bernoulli distribution with parameter θ, i.e. P(Xi = 1) = θ = 1 − P(Xi = 0).
1. What is the maximum likelihood estimate θ̂ of θ?
2. Show that the maximum likelihood estimator of θ is unbiased.
3. We wish to estimate the variance θ(1 − θ) of Xi. With x̄ the empirical average, consider T = x̄(1 − x̄). Check that T is not an unbiased estimator, and propose an unbiased estimator of θ(1 − θ).
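The bias claimed in part 3 can be checked numerically. A minimal simulation sketch (sample size, θ, and the n/(n−1) correction are my illustrative choices; the correction follows from E[x̄(1 − x̄)] = (1 − 1/n)·θ(1 − θ)):

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 0.3, 10, 200_000

# reps independent Bernoulli samples of size n
x = rng.binomial(1, theta, size=(reps, n))
xbar = x.mean(axis=1)

T = xbar * (1 - xbar)          # plug-in estimator of theta*(1-theta)
T_unb = n / (n - 1) * T        # candidate unbiased correction

# T.mean() should sit near (1 - 1/n)*theta*(1-theta) = 0.9 * 0.21,
# while T_unb.mean() should sit near theta*(1-theta) = 0.21.
```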
3. [20 marks] Consider the multinomial distribution with 3 categories, where the random variables X1, X2 and X3 have the joint probability function

f(x; θ) = (n! / (x1! x2! x3!)) θ1^x1 θ2^x2 (1 − θ1 − θ2)^x3,

where x = (x1, x2, x3), θ = (θ1, θ2), n = x1 + x2 + x3, θ1, θ2 > 0 and 1 − θ1 − θ2 > 0.
(a) [4 marks] Find the maximum likelihood estimator θ̂ of θ.
(b) [4 marks] Find the Fisher information matrix I(θ).
(c) [4 marks] Show that θ̂ is an MVUE.
(d) …
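For part (a), the standard closed form for the multinomial MLE is θ̂i = xi/n; a small numerical spot-check (the counts below are made up for illustration):

```python
import numpy as np

def loglik(t1, t2, x1, x2, x3):
    # Multinomial(3) log-likelihood, up to the constant log n!/(x1! x2! x3!).
    return x1 * np.log(t1) + x2 * np.log(t2) + x3 * np.log(1 - t1 - t2)

x1, x2, x3 = 30, 50, 20
n = x1 + x2 + x3
mle = (x1 / n, x2 / n)   # candidate closed form: theta_hat_i = x_i / n

# The closed form should beat any nearby perturbation of (theta1, theta2).
best = loglik(*mle, x1, x2, x3)
for d1, d2 in [(0.01, 0), (-0.01, 0), (0, 0.01), (0, -0.01)]:
    assert loglik(mle[0] + d1, mle[1] + d2, x1, x2, x3) <= best
```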
A (3 pt) Let X1, . . . , Xn be drawn from the distribution fθ(x) = 4θ⁴/(x + θ)⁵, for 0 < x < ∞ and 0 < θ < ∞. We define Y = 3X̄ as an estimator for θ. Verify whether this estimator is unbiased. Find the MSE of Y. Hint: E(Y) = 3E(X̄) = 3E(X).
B (3 pt) Let X1, . . . , Xn be drawn from the distribution fθ(x) = … for 0 < x < ∞ and 0 < θ < ∞. We define Y = 2X̄ …
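A hedged simulation sketch for part A. The density in the source is garbled; the code assumes the reading fθ(x) = 4θ⁴/(x + θ)⁵, a shifted Pareto with E[X] = θ/3, under which Y = 3X̄ is unbiased. If the intended density differs, the sampler below does not apply.

```python
import numpy as np

rng = np.random.default_rng(1)
theta, n = 2.0, 1_000_000

# Under the assumed density, F(x) = 1 - theta^4/(x+theta)^4,
# so inverse-CDF sampling gives X = theta * (U^(-1/4) - 1).
u = rng.uniform(size=n)
x = theta * (u ** -0.25 - 1.0)

Y = 3 * x.mean()   # the estimator from part A; E[X] = theta/3 implies E[Y] = theta
```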
7. Let X1, . . . , Xn be a random sample from a Bernoulli distribution with parameter p. A random variable X with Bernoulli distribution has a probability mass function (pmf)

f(x; p) = p^x (1 − p)^(1−x), x = 0, 1,

with E(X) = p and Var(X) = p(1 − p).
(a) Find the method of moments (MOM) estimator of p.
(b) Find a sufficient statistic for p. (Hint: Be careful when you write the joint pmf. Don't forget to sum the whole power of each term; that is, for the second term you will have (1 …
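The hint in (b) points at the factorization: the joint pmf collapses to p^Σx (1 − p)^(n − Σx), so it depends on the data only through ΣXi. A small sketch demonstrating this (the two samples are made up; they share the same sum):

```python
from math import prod

def joint_pmf(xs, p):
    # Bernoulli joint pmf: prod of p^x * (1-p)^(1-x) = p^sum * (1-p)^(n-sum)
    return prod(p**x * (1 - p) ** (1 - x) for x in xs)

# Two different samples with the same sum give identical likelihoods for
# every p, illustrating that sum(X_i) is sufficient for p.
a = [1, 1, 0, 0, 0]
b = [0, 0, 1, 0, 1]
for p in (0.2, 0.5, 0.8):
    assert abs(joint_pmf(a, p) - joint_pmf(b, p)) < 1e-15
```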
Problem 4. Define f(x) as follows:

f(x) = θ² for −1 ≤ x < 0, f(x) = 1 − θ² for 0 ≤ x ≤ 1, and f(x) = 0 otherwise.

Let X1, …, Xn be iid random variables with density f for some unknown θ ∈ (0, 1). Let a be the number of Xi which are negative and b be the number of Xi which are positive, so the total number of samples is n = a + b. Find the maximum likelihood estimator of θ. Is it asymptotically normal in this sample? Find the asymptotic variance. Consider the following hypotheses: H0: X is …
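A hedged simulation sketch. The exponent in the source is ambiguous ("θ2" could mean θ/2 or θ²); the code assumes the θ² reading, under which the likelihood is proportional to (θ²)^a (1 − θ²)^b and the candidate MLE is θ̂ = √(a/n). If the intended density uses θ/2, replace accordingly.

```python
import numpy as np

rng = np.random.default_rng(2)
theta, n = 0.6, 500_000

# Assumed density: mass theta^2 spread uniformly on [-1, 0),
# mass 1 - theta^2 spread uniformly on [0, 1].
neg = rng.uniform(size=n) < theta**2
x = np.where(neg, rng.uniform(-1, 0, n), rng.uniform(0, 1, n))

a = int(np.sum(x < 0))        # number of negative observations
theta_hat = np.sqrt(a / n)    # candidate MLE from L proportional to (θ²)^a (1-θ²)^b
```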
4. Let X1, . . . , Xn be a random sample from a random variable X with probability density function

f(x; θ) = (1/(2θ³)) x² e^(−x/θ), 0 < x < ∞, 0 < θ < ∞.

(a) Find the likelihood function, L(θ), and the log-likelihood function, ℓ(θ).
(b) Find the maximum likelihood estimator of θ, θ̂.
(c) Is θ̂ unbiased?
(d) What is the distribution of X? Find the moment estimator of θ, θ̃.
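A simulation sketch for (b) and (d). The closed form θ̂ = X̄/3 is my own derivation (setting the score −3n/θ + Σxi/θ² to zero), using the fact that f is the Gamma(shape 3, scale θ) density with E[X] = 3θ; treat it as an assumption, not the model answer.

```python
import numpy as np

rng = np.random.default_rng(3)
theta, n = 2.5, 1_000_000

# f(x; theta) = x^2 e^(-x/theta) / (2 theta^3) is Gamma(shape=3, scale=theta).
x = rng.gamma(shape=3.0, scale=theta, size=n)

theta_hat = x.mean() / 3.0   # candidate MLE (coincides with the moment estimator)
```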
1. Let X1, . . . , Xn be a random sample from a distribution with p.d.f.

f(x; θ) = θ x^(θ−1), 0 < x < 1,

where θ > 0.
(a) Find a sufficient statistic Y for θ.
(b) Show that the maximum likelihood estimator θ̂ is a function of Y.
(c) Determine the Rao-Cramér lower bound for the variance of unbiased estimators of θ.
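A simulation sketch for (b). The closed form θ̂ = −n/Σ log Xi is my own derivation (from ℓ(θ) = n log θ + (θ − 1) Σ log xi); note it depends on the data only through Σ log Xi, i.e. through Y = ∏ Xi.

```python
import numpy as np

rng = np.random.default_rng(4)
theta, n = 3.0, 1_000_000

# F(x) = x^theta on (0, 1), so inverse-CDF sampling gives X = U^(1/theta).
x = rng.uniform(size=n) ** (1.0 / theta)

# Candidate MLE; a function of Y = prod(X_i) through sum(log X_i).
theta_hat = -n / np.log(x).sum()
```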
20. Let X1, X2, . . . , Xn be a random sample from a population X with density function

f(x; θ) = C(m, x) θ^x (1 − θ)^(m−x) for x = 0, 1, 2, . . . , m, and f(x; θ) = 0 otherwise,

where 0 < θ < 1 is a parameter and C(m, x) denotes the binomial coefficient. Show that X̄/m is a uniform minimum variance unbiased estimator of θ for a fixed m.
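The source garbles the name of the estimator; X̄/m is the natural candidate, since E[X] = mθ for this Binomial(m, θ) population. A simulation sketch of its unbiasedness (sample sizes and θ are my illustrative choices; unbiasedness alone does not prove UMVUE status, which the problem asks you to establish):

```python
import numpy as np

rng = np.random.default_rng(5)
m, theta, n, reps = 5, 0.4, 20, 100_000

# reps independent Binomial(m, theta) samples of size n
x = rng.binomial(m, theta, size=(reps, n))

est = x.mean(axis=1) / m   # candidate estimator X-bar / m; E[X-bar/m] = theta
# est.mean() should sit near theta = 0.4
```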