Problem 1.1 Consider a family of probability distributions parameterized by $\theta \in \{1, 2, 3\}$. The following...
3. Consider a random sample $Y_1, \dots, Y_n$ from a Uniform$[0, \theta]$ distribution. In class we discussed the method of moments estimator $\hat\theta = 2\bar{Y}$ and the maximum likelihood estimator $\hat\theta = \max(Y_1, \dots, Y_n)$, and we derived the bias and MSE of both estimators. To correct the bias of the MLE $\hat\theta$ we proposed the new estimator $\hat\theta_u = \frac{n+1}{n}\max(Y_1, \dots, Y_n)$, where the subscript $u$ stands for "unbiased." (a) Find the MSE of $\hat\theta_u$. (b) Compare the MSE of $\hat\theta_u$ to the MSE of $\hat\theta$, the original...
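A quick Monte Carlo sketch of problem 3 (not part of the assignment; the values $\theta = 2$ and $n = 10$ are arbitrary choices, and the bias-correction factor $(n+1)/n$ is the standard one for the uniform MLE):

```python
import numpy as np

# Compare the MSE of the MLE max(Y_i) with the bias-corrected estimator
# ((n+1)/n) * max(Y_i) for a Uniform[0, theta] sample.
rng = np.random.default_rng(0)
theta, n, reps = 2.0, 10, 100_000

y = rng.uniform(0, theta, size=(reps, n))
mle = y.max(axis=1)                 # theta_hat = max(Y_1, ..., Y_n)
unbiased = (n + 1) / n * mle        # theta_hat_u, the bias-corrected version

mse_mle = np.mean((mle - theta) ** 2)
mse_u = np.mean((unbiased - theta) ** 2)

# Theory: MSE(theta_hat) = 2*theta^2 / ((n+1)(n+2)),
#         MSE(theta_hat_u) = theta^2 / (n(n+2))  (variance only, since unbiased)
print(mse_mle, 2 * theta**2 / ((n + 1) * (n + 2)))
print(mse_u, theta**2 / (n * (n + 2)))
```

The simulated values should match the closed-form expressions, with the bias-corrected estimator showing the smaller MSE at this sample size.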
14. For each of the following distributions, derive a general expression for the Maximum Likelihood Estimator (MLE). Carry out the second derivative test to make sure you really have a maximum. Then use the data to calculate a numerical estimate. (a) $p(x) = \theta(1-\theta)^x$ for $x = 0, 1, \dots$, where $0 < \theta < 1$. Data: 4, 0, 1, 0, 1, 3, ... (b) $f(x) = \alpha x^{-\alpha - 1}$ for $x > 1$, where $\alpha > 0$. Data: 1.37, 2.89, 1.52, 1.77, 1.04, ... (c) $f(x) = \dots$, for...
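A numerical sketch for parts (a) and (b) of problem 14 (part (c) is truncated in the source, and the data lists above end mid-sequence, so only the visible values are used; the closed-form MLEs quoted in the comments are the standard derivations for these two families):

```python
import math

# (a) p(x) = theta * (1 - theta)^x, x = 0, 1, ... : the MLE is
#     theta_hat = n / (n + sum(x)) = 1 / (1 + xbar).
x_a = [4, 0, 1, 0, 1, 3]
theta_hat = len(x_a) / (len(x_a) + sum(x_a))   # = 6 / 15 = 0.4
print(theta_hat)

# (b) f(x) = alpha * x^(-alpha - 1), x > 1 (Pareto with minimum 1):
#     the MLE is alpha_hat = n / sum(log x_i).
x_b = [1.37, 2.89, 1.52, 1.77, 1.04]
alpha_hat = len(x_b) / sum(math.log(x) for x in x_b)
print(alpha_hat)
```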
Question 1 [11 Marks] Let X be a random variable with the following pdf: $f(x;\theta) = \dots$, $x > 0$, $\theta > 0$. (a) Show that $\int f(x;\theta)\,dx = 1$. (b) Is the pdf $f(x;\theta)$ a member of the exponential family of distributions? Justify your answer in detail. (c) Find a sufficient statistic for the unknown parameter $\theta$. (d) Find a maximum likelihood estimator for $\theta$.
Q1. Consider a random variable Y having probability density function $f_Y(y\,|\,\delta) = \dots$ (0 otherwise). Given $Y_1, \dots, Y_n$, a sequence of i.i.d. observations on Y: 1. Determine the maximum likelihood estimator (MLE) of $\delta$. Denote this estimator, associated with a sample of size n, as $\hat\delta_n$. 2. Derive the score function, denoted by $S_n(\delta) = \frac{\partial \log \prod_{i=1}^n f_Y(y_i\,|\,\delta)}{\partial \delta}$, and show that it has an expected value of zero. 3. Derive the information per observation and show that it is equal...
Please give detailed steps. Thank you. 5. Let $\{X_i : i = 1, \dots, n\}$ denote a random sample of size n from a population described by a random variable X following a Poisson($\theta$) distribution with pmf $p(x;\theta) = \frac{\theta^x e^{-\theta}}{x!}$. You may take it as given that $E(X) = \theta$ and $\mathrm{var}(X) = \theta$ (i.e. you do not need to show these). a. Recall that an estimator is efficient if it satisfies 2 conditions: 1) it is unbiased, and 2) it achieves the Cramér–Rao Lower Bound (CRLB) for unbiased estimators. Show that...
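A simulation sketch of the efficiency claim in problem 5a, assuming (as such problems usually intend) that the estimator in question is the sample mean $\bar{X}$: it is unbiased, and its variance matches the CRLB $1/(nI(\theta)) = \theta/n$, since the Fisher information of a Poisson($\theta$) observation is $I(\theta) = 1/\theta$.

```python
import numpy as np

# Verify numerically that E(X-bar) = theta and Var(X-bar) = theta/n = CRLB
# for a Poisson(theta) sample. theta = 3 and n = 25 are arbitrary choices.
rng = np.random.default_rng(1)
theta, n, reps = 3.0, 25, 200_000

xbar = rng.poisson(theta, size=(reps, n)).mean(axis=1)
crlb = theta / n          # 1 / (n * I(theta)) with I(theta) = 1 / theta

print(xbar.mean())        # close to theta = 3.0 (unbiasedness)
print(xbar.var(), crlb)   # simulated variance vs. CRLB theta/n = 0.12
```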
Instructions: For each of the following distributions, compute the maximum likelihood estimator based on n i.i.d. observations $X_1, \dots, X_n$ and the Fisher information, if defined. If it is not, enter DNE in each applicable input box. Here each $X_i$ has density $\frac{1}{\sqrt{2\pi\sigma^2}} \exp\left(-\frac{(x-1)^2}{2\sigma^2}\right)$. Hint: keep in mind that we consider $\sigma^2$ as the parameter, not $\sigma$. You may want to write $\tau = \sigma^2$ in your computation. (Enter barx_n for the sample average $\bar{X}_n$ and bar(X_n^2)...
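A sketch for the case above, reading the garbled density as $N(1, \sigma^2)$ with known mean 1 (the mean value is an assumption recovered from the damaged formula and may differ in the original). With known mean, the MLE of $\tau = \sigma^2$ is $\hat\tau = \frac{1}{n}\sum (X_i - 1)^2$, and the Fisher information for $\tau$ is $I(\tau) = \frac{1}{2\tau^2}$.

```python
import numpy as np

# Check that tau_hat = mean((X_i - 1)^2) recovers tau = sigma^2 for a large
# sample from N(1, tau). tau = 4 and n = 50_000 are arbitrary choices.
rng = np.random.default_rng(2)
tau, n = 4.0, 50_000

x = rng.normal(1.0, np.sqrt(tau), size=n)
tau_hat = np.mean((x - 1.0) ** 2)
fisher = 1.0 / (2.0 * tau_hat**2)   # plug-in estimate of I(tau) = 1/(2 tau^2)

print(tau_hat)   # close to tau = 4.0 for large n
print(fisher)
```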
4. Let $X_1, \dots, X_n$ be a random sample from a random variable X with probability density function $f(x;\theta) = \frac{1}{2\theta^3} x^2 e^{-x/\theta}$, $0 < x < \infty$, $0 < \theta < \infty$. (a) Find the likelihood function, $L(\theta)$, and the log-likelihood function, $\ell(\theta)$. (b) Find the maximum likelihood estimator of $\theta$, $\hat\theta$. (c) Is $\hat\theta$ unbiased? (d) What is the distribution of X? Find the moment estimator of $\theta$, $\tilde\theta$.
5. Consider a random sample $Y_1, \dots, Y_n$ from a distribution with pdf $f(y\,|\,\theta) = \frac{1}{\theta^2} y e^{-y/\theta}$, $0 < y < \infty$. Calculate the ML estimator of $\theta$. 6. Consider the pdf $g(y\,|\,\alpha) = c(1 + \alpha y^2)$, $-1 < y < 1$. (a) Show that $g(y\,|\,\alpha)$ is a pdf when $c = \frac{3}{6 + 2\alpha}$. (b) Calculate $E(Y)$ and $E(Y^2)$. Referencing your calculations, explain why $M_1$ can't be...
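A quick numerical check of part 6(a): $\int_{-1}^{1} (1 + \alpha y^2)\,dy = 2 + \frac{2\alpha}{3}$, so $c = \frac{3}{6+2\alpha}$ makes $g$ integrate to 1. A midpoint-rule sketch (the choices of $\alpha$ are arbitrary, nonnegative so $g \ge 0$):

```python
# Integrate g(y) = c * (1 + alpha * y^2) over (-1, 1) by the midpoint rule
# and confirm the result is 1 when c = 3 / (6 + 2*alpha).
def check(alpha, steps=100_000):
    c = 3.0 / (6.0 + 2.0 * alpha)
    h = 2.0 / steps
    total = sum(c * (1.0 + alpha * (-1.0 + (k + 0.5) * h) ** 2)
                for k in range(steps)) * h
    return total

print(check(0.5))   # close to 1
print(check(2.0))   # close to 1
```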
(9) [12 pts] An exponentially distributed random variable, call it X, has the following probability density function: $f(x) = \theta e^{-\theta x}$, $x > 0$, $\theta > 0$. Note that $E[X] = \frac{1}{\theta}$ and $V[X] = \frac{1}{\theta^2}$. For the rest of this question, assume that you have a data set $\{x_n\}_{n=1}^{N}$ consisting of a random sample of N observations of X. (a) Derive two different Method of Moments estimators for $\theta$. HINT: remember that the MOM is based on the analogy principle, or the idea that...
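One possible answer to part (a), sketched numerically (the problem may intend a different pair of estimators): matching the first moment, $\bar{x} = 1/\theta$ gives $\hat\theta_1 = 1/\bar{x}$; matching the second moment, $E[X^2] = V[X] + E[X]^2 = 2/\theta^2$ gives $\hat\theta_2 = \sqrt{2/\overline{x^2}}$.

```python
import numpy as np

# Compare two method-of-moments estimators of the exponential rate theta
# on simulated data. theta = 1.5 and N = 100_000 are arbitrary choices.
rng = np.random.default_rng(4)
theta, N = 1.5, 100_000

x = rng.exponential(scale=1.0 / theta, size=N)
theta_1 = 1.0 / x.mean()                 # from the first sample moment
theta_2 = np.sqrt(2.0 / np.mean(x**2))   # from the second sample moment

print(theta_1, theta_2)   # both close to theta = 1.5
```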
3. [20 marks] Consider the multinomial distribution with 3 categories, where the random variables $X_1$, $X_2$ and $X_3$ have the joint probability function $f(x;\theta) = \frac{n!}{x_1!\,x_2!\,x_3!}\,\theta_1^{x_1}\theta_2^{x_2}(1-\theta_1-\theta_2)^{x_3}$, where $x = (x_1, x_2, x_3)$, $\theta = (\theta_1, \theta_2)$, $n = x_1 + x_2 + x_3$, $\theta_1, \theta_2 > 0$ and $1 - \theta_1 - \theta_2 > 0$. (a) [4 marks] Find the maximum likelihood estimator $\hat\theta$ of $\theta$. (b) [4 marks] Find the Fisher information matrix $I(\theta)$. (c) [4 marks] Show that $\hat\theta$ is an MVUE. (d)...
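A sketch for part (a), assuming the standard trinomial pmf given above: the MLE is $\hat\theta = (x_1/n,\, x_2/n)$. A small grid search over the log-likelihood (the count vector is an arbitrary example) confirms the analytic answer:

```python
import numpy as np

# Grid-search the trinomial log-likelihood and compare the maximizer with the
# closed-form MLE (x1/n, x2/n). Example counts: (5, 3, 2), so n = 10 and the
# analytic MLE is (0.5, 0.3).
x1, x2, x3 = 5, 3, 2
n = x1 + x2 + x3

def loglik(t1, t2):
    # constant term n!/(x1! x2! x3!) omitted; it does not affect the argmax
    return x1 * np.log(t1) + x2 * np.log(t2) + x3 * np.log(1 - t1 - t2)

grid = np.linspace(0.01, 0.98, 98)   # step 0.01 over the open interval
best = max(((loglik(t1, t2), t1, t2)
            for t1 in grid for t2 in grid if t1 + t2 < 0.99),
           key=lambda v: v[0])

print(best[1], best[2])   # near the analytic MLE (0.5, 0.3)
```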