Invariance Property of the MLE:
If T is the MLE of p, then g(T) is the MLE of g(p), where g(p) is any function of p.
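As a quick illustration of the property (a standard example, stated in the Bernoulli notation of Problem 1 below): if the MLE of p is Ȳ, then the MLE of the variance g(p) = p(1 − p) is obtained by plugging in,
g(Ȳ) = Ȳ(1 − Ȳ).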
1. Suppose Y₁, Y₂, …, Yₙ is an iid sample from a Bernoulli(p) population distribution, where 0 < p < 1 is unknown. The population pmf is p_Y(y|p) = p^y (1 − p)^{1−y} for y = 0, 1, and 0 otherwise. (a) Prove that Ȳ is the maximum likelihood estimator (MLE) of p. …
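A solution sketch for (a), for orientation (the full write-up should also verify the second-order condition): the log-likelihood is
ℓ(p) = (Σ_{i=1}^n yᵢ) log p + (n − Σ_{i=1}^n yᵢ) log(1 − p),
and setting ℓ′(p) = Σ yᵢ/p − (n − Σ yᵢ)/(1 − p) = 0 gives p̂ = (1/n) Σ_{i=1}^n yᵢ = ȳ, with ℓ″(p̂) < 0 confirming a maximum.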
Let Y₁, Y₂, …, Yₙ be an iid sample from a population distribution described by the pdf f_Y(y|θ) = (θ + 1) y^θ, 0 < y < 1, for θ > −1. (a) Find the MOM estimator of θ. (b) Find the maximum likelihood estimator (MLE) of θ. (c) Find the MLE of the population mean E(Y) = (θ + 1)/(θ + 2). You do not need to prove that the above is true; just find its MLE.
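A sketch under the pdf above: equating Ȳ to E(Y) = (θ + 1)/(θ + 2) gives the MOM estimator θ̃ = (2Ȳ − 1)/(1 − Ȳ); the log-likelihood ℓ(θ) = n log(θ + 1) + θ Σ_{i=1}^n log Yᵢ has root θ̂ = −n/Σ_{i=1}^n log Yᵢ − 1; and by the invariance property quoted at the top, the MLE of E(Y) is (θ̂ + 1)/(θ̂ + 2).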
Suppose Y₁, Y₂, …, Yₙ is an iid sample from a Pareto population distribution described by the pdf f_Y(y|θ) = θ α₀^θ y^{−θ−1}, y > α₀, where the parameter α₀ > 0 is known. The unknown parameter is θ > 0. (a) Find the MOM estimator of θ. (b) Find the MLE of θ.
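A sketch with this parameterization: since E(Y) = θα₀/(θ − 1) for θ > 1, solving Ȳ = θα₀/(θ − 1) gives the MOM estimator θ̃ = Ȳ/(Ȳ − α₀); and maximizing ℓ(θ) = n log θ + nθ log α₀ − (θ + 1) Σ_{i=1}^n log Yᵢ gives θ̂ = n / Σ_{i=1}^n log(Yᵢ/α₀).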
Let S₁, S₂, …, Sₙ be a random sample from a Bernoulli distribution with pmf f_S(s|p) = p^s (1 − p)^{1−s}, s = 0, 1. (a) Calculate E(S) and find the MOM estimator of p. (b) Construct the log-likelihood function and use this to find the maximum likelihood (ML) estimator of p.
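Briefly, since the algebra mirrors the Bernoulli derivation sketched under Problem 1: E(S) = 0·(1 − p) + 1·p = p, so the moment equation p = S̄ gives p̂_MOM = S̄, and the ML estimator from the log-likelihood ℓ(p) = (Σ sᵢ) log p + (n − Σ sᵢ) log(1 − p) coincides with it.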
Suppose that X₁, X₂, …, Xₙ is an iid sample from the probability mass function (pmf) given by f_X(x|θ) = (1 − θ)θ^x for x = 0, 1, 2, …, and 0 otherwise, where 0 < θ < 1. (a) Find the maximum likelihood estimator of θ. (b) Find the Cramér-Rao Lower Bound (CRLB) on the variance of unbiased estimators of E_θ(X). Can this lower bound be attained? (c) Find the method of moments estimator of θ. (d) Put a beta(2,3) prior distribution on θ. Find the posterior mean. Treating this as a frequentist…
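A sketch for (a) and (c), under the geometric-type pmf above: ℓ(θ) = n log(1 − θ) + (Σ_{i=1}^n xᵢ) log θ has root θ̂ = X̄/(1 + X̄); and since E_θ(X) = θ/(1 − θ), the moment equation X̄ = θ/(1 − θ) yields the same estimator.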
Suppose Y₁, Y₂, …, Yₙ are such that Yᵢ ~ Bernoulli(p), and let X = Σ_{i=1}^n Yᵢ. (a) [1 point] Use the distribution of X to show that the method of moments estimator of p is p̂_MM = X/n. (Work that is unclear or that cannot be followed from step to step will not receive full credit.) (b) [2 points] Show that the method of moments estimator p̂_MM is a consistent estimator of p. Please show your work to support your…
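One route for (b), as a sketch: E(p̂_MM) = E(X)/n = np/n = p and Var(p̂_MM) = p(1 − p)/n → 0 as n → ∞, so Chebyshev's inequality gives p̂_MM → p in probability; equivalently, p̂_MM = Ȳ → p by the weak law of large numbers.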
7.5.12 Suppose that X₁, …, Xₙ are iid Bernoulli(p), where 0 < p < 1 is an unknown parameter. Consider the parametric function τ(p) = p + qe, with q = 1 − p. (i) Find a suitable unbiased estimator T for τ(p); (ii) since the complete sufficient statistic is U = Σ_{i=1}^n Xᵢ, use the Lehmann-Scheffé theorems and evaluate the conditional expectation E[T | U = u]; (iii) hence, derive the UMVUE for τ(p). (Hint: Try and use the mgf of the Xᵢ's appropriately.)
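A sketch, assuming the parametric function is τ(p) = p + qe as reconstructed above (so that τ(p) = E[e^{1−X₁}], which is what the mgf hint suggests): T = e^{1−X₁} is then unbiased; since P(X₁ = 1 | U = u) = u/n, the conditional expectation is E[T | U = u] = u/n + e(1 − u/n), so by Lehmann-Scheffé the UMVUE of τ(p) is U/n + e(1 − U/n).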
2. Suppose Y₁, …, Yₙ are IID discrete random variables with P(Yᵢ = 0) = θ₀, P(Yᵢ = 1) = θ₁, P(Yᵢ = 2) = θ₂, where the parameter vector θ = (θ₀, θ₁, θ₂) satisfies θⱼ > 0 and Σ_{j=0}^2 θⱼ = 1. (a) Calculate E[Y] and E[Y²], and use the results to derive a method of moments estimator for the parameters (θ₁, θ₂). (b) Show that the maximum likelihood estimator for θ = (θ₀, θ₁, θ₂) is θ̂₀ = (1/n) Σ_{i=1}^n 1(Yᵢ = 0), …
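For (a), a sketch: E[Y] = θ₁ + 2θ₂ and E[Y²] = θ₁ + 4θ₂, so with sample moments m₁ = Ȳ and m₂ = (1/n) Σ_{i=1}^n Yᵢ², solving the two moment equations gives θ̃₂ = (m₂ − m₁)/2 and θ̃₁ = 2m₁ − m₂.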
B1. A random sample of n observations, Y₁, …, Yₙ, is selected from a population in which Yᵢ, for i = 1, 2, …, n, possesses a common distribution, the same as that of the population distribution Y. (a) Suppose that we know Y has a Geometric distribution with parameter p, p unknown. Find the estimator of p using the method of moments. (b) Suppose that we know that Y has an exponential distribution with parameter λ, λ unknown. Find the estimator…
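A sketch, assuming the usual support conventions (Geometric on {1, 2, …} with E(Y) = 1/p, and Exponential with rate λ so that E(Y) = 1/λ): in each case the single moment equation Ȳ = E(Y) gives p̂ = 1/Ȳ and λ̂ = 1/Ȳ, respectively.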
Exercise 6. Let (X₁, …, Xₙ) be an iid sample from the Bernoulli distribution with parameter θ, i.e. P(Xᵢ = 1) = θ = 1 − P(Xᵢ = 0). 1. What is the maximum likelihood estimate θ̂ of θ? 2. Show that the maximum likelihood estimator of θ is unbiased. 3. We are looking to estimate the variance θ(1 − θ) of Xᵢ. With X̄ being the empirical average, consider T = X̄(1 − X̄). Check that T is not unbiased, and propose an unbiased estimator of θ(1 − θ).
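For part 3, a sketch: E[X̄(1 − X̄)] = E[X̄] − E[X̄²] = θ − (θ(1 − θ)/n + θ²) = (1 − 1/n) θ(1 − θ), so T is biased; rescaling by n/(n − 1) yields the unbiased estimator n X̄(1 − X̄)/(n − 1).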
3. Let Y₁, …, Yₙ be a random sample from a distribution with probability mass function f(a; θ) = [(1 − θ)/2]^{|a|} θ^{1−|a|}, a = −1, 0, 1, where 0 < θ < 1. a. [6 pts] Show that the maximum likelihood estimator of θ is θ̂ = (1/n) Σ_{i=1}^n 1(Yᵢ = 0). Hint: With the use of indicator functions, a Bernoulli distribution can be written as f(a; θ) = θ 1_{1}(a) + (1 − θ) 1_{0}(a) or, equivalently, f(a; θ) = θ^{1_{1}(a)} (1 − θ)^{1_{0}(a)}. One of these will simplify the likelihood equation for this problem.
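A sketch under the pmf as reconstructed above: since |a| ∈ {0, 1}, the likelihood factors as L(θ) = θ^{n − Σ|Yᵢ|} [(1 − θ)/2]^{Σ|Yᵢ|}, i.e. |Yᵢ| behaves as a Bernoulli(1 − θ) variable; maximizing gives θ̂ = 1 − (1/n) Σ_{i=1}^n |Yᵢ| = (1/n) Σ_{i=1}^n 1(Yᵢ = 0).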