3. (15 Points) Let $X_1 \sim \mathrm{Bernoulli}(p)$ and $X_2 \sim \mathrm{Bernoulli}(3p)$ be independent Bernoulli random variables ...
2. Let $X_1, X_2, \ldots, X_n$ be independent continuous random variables from the following distribution: $f(x) = a x^{-(a+1)}$, where $x \geq 1$ and $a > 1$. You may use the fact: $E[X] = \frac{a}{a-1}$.
2.1 Show that the maximum likelihood estimator of $a$ is $\hat{a}_{\mathrm{MLE}} = \frac{n}{\sum_{i=1}^n \log X_i}$.
2.2 Show that the method of moments estimator for $a$ is $\hat{a}_{\mathrm{MOM}} = \frac{\bar{X}}{\bar{X} - 1}$.
2.3 Derive a sufficient statistic for $a$. What theorem are you using to determine sufficiency?
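A sketch of the log-likelihood step behind 2.1, using the stated density:

```latex
\ell(a) = \log \prod_{i=1}^n a X_i^{-(a+1)}
        = n \log a - (a+1) \sum_{i=1}^n \log X_i,
\qquad
\frac{\partial \ell}{\partial a} = \frac{n}{a} - \sum_{i=1}^n \log X_i = 0
\;\Longrightarrow\;
\hat{a}_{\mathrm{MLE}} = \frac{n}{\sum_{i=1}^n \log X_i}.
```

For 2.2, equating the given mean to the sample mean, $\bar{X} = \frac{a}{a-1}$, and solving for $a$ gives $\hat{a}_{\mathrm{MOM}} = \frac{\bar{X}}{\bar{X}-1}$.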
Let $X_i$, $i = 1, 2, \ldots, n$, be independent Bernoulli($p$) random variables and let $\bar{Y}_n = \frac{1}{n}\sum_{i=1}^n X_i$. Use the delta method to find the limiting distribution of $g(\bar{Y}_n) = \bar{Y}_n(1 - \bar{Y}_n)$, for $p \neq \frac{1}{2}$.
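One way to set up the delta method here, assuming $\bar{Y}_n$ denotes the sample mean of the $X_i$:

```latex
\sqrt{n}\,(\bar{Y}_n - p) \xrightarrow{d} N\bigl(0,\; p(1-p)\bigr),
\qquad
g(y) = y(1-y),\quad g'(p) = 1 - 2p \neq 0 \text{ for } p \neq \tfrac{1}{2},
```

so the delta method gives

```latex
\sqrt{n}\,\bigl(g(\bar{Y}_n) - p(1-p)\bigr)
\xrightarrow{d} N\bigl(0,\; (1-2p)^2\, p(1-p)\bigr).
```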
2. Let $X_1, X_2, \ldots, X_n$ be independent continuous random variables from the following distribution: $f(x) = a x^{-(a+1)}$, where $x \geq 1$ and $a > 1$. You may use the fact: $E[X] = \frac{a}{a-1}$.
2.1 Show that the maximum likelihood estimator of $a$ is $\hat{a}_{\mathrm{MLE}} = \frac{n}{\sum_i \log X_i}$.
2.3 Derive a sufficient statistic for $a$. What theorem are you using to determine sufficiency?
2.4 Show that the Fisher information in the whole sample is $I(a) = \frac{n}{a^2}$.
2.5 What Cramer-Rao lower bound...
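The Fisher information in 2.4 follows from the second derivative of the log-density:

```latex
\log f(x \mid a) = \log a - (a+1)\log x,
\qquad
\frac{\partial^2}{\partial a^2}\log f(x \mid a) = -\frac{1}{a^2},
```

so $I(a) = -nE\!\left[\frac{\partial^2}{\partial a^2}\log f(X \mid a)\right] = \frac{n}{a^2}$, and the Cramer-Rao lower bound for an unbiased estimator of $a$ is $\frac{a^2}{n}$.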
7. Let $X_1, \ldots, X_n$ be a random sample from a Bernoulli distribution with parameter $p$. A random variable $X$ with a Bernoulli distribution has probability mass function (pmf) $f(x) = p^x (1-p)^{1-x}$, $x \in \{0, 1\}$, with $E(X) = p$ and $\mathrm{Var}(X) = p(1-p)$.
(a) Find the method of moments (MOM) estimator of $p$.
(b) Find a sufficient statistic for $p$. (Hint: Be careful when you write the joint pmf. Don't forget to sum the whole power of each term; that is, for the second term you will have $(1-p)^{\,n - \sum_i x_i}$ ...
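The joint pmf the hint in (b) points at, which sets up the factorization argument:

```latex
\prod_{i=1}^{n} p^{x_i} (1-p)^{1-x_i}
  = p^{\sum_i x_i}\,(1-p)^{\,n - \sum_i x_i},
```

so $T = \sum_i X_i$ is sufficient for $p$ by the Fisher-Neyman factorization theorem.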
Problem 5. Let $X_1, X_2, \ldots, X_n$ be a random sample from Bernoulli($p$), $0 < p < 1$, and let $\hat{p}_n = \frac{1}{n}\sum_{i=1}^n X_i$.
7.i. Prove that the sample proportion $\hat{p}_n$ is an unbiased estimator of $p$.
7.ii. Derive an expression for the variance of $\hat{p}_n$.
7.iii. Prove that the sample proportion is a consistent estimator of $p$.
7.iv. Prove that $\hat{p}_n(1 - \hat{p}_n)$ ...
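A quick Monte Carlo sanity check of 7.i and 7.ii (unbiasedness, and variance $p(1-p)/n$, of the sample proportion); the values $p = 0.3$, $n = 100$ are arbitrary illustration choices, not from the problem:

```python
import random

random.seed(42)

p, n, reps = 0.3, 100, 20000

# Draw `reps` Bernoulli(p) samples of size n and record each sample proportion.
phats = []
for _ in range(reps):
    sample = [1 if random.random() < p else 0 for _ in range(n)]
    phats.append(sum(sample) / n)

# Mean of the sample proportions should be close to p (unbiasedness),
# and their variance close to p(1-p)/n = 0.0021 here.
mean_phat = sum(phats) / reps
var_phat = sum((x - mean_phat) ** 2 for x in phats) / reps

print(mean_phat, var_phat)  # roughly 0.3 and 0.0021
```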
5. Let $X_1, \ldots, X_n$ ($n \geq 3$) be iid Bernoulli random variables with parameter $\theta$, $0 < \theta < 1$. Let $T = \sum_i X_i$ and define $\delta(X_1, \ldots, X_n) = \ldots$ and $0$ otherwise.
(a) Derive $E_\theta[\delta(X_1, \ldots, X_n)]$.
(b) Derive $E_\theta[\delta(X_1, \ldots, X_n) \mid T = t]$, for $t = 0, 1, \ldots, n$.
Question 4 [15 marks] The random variables $X_1, \ldots, X_n$ are independent and identically distributed with probability function
$f_X(x) = p_X^x (1 - p_X)^{1-x}$ for $x \in \{0, 1\}$, and $0$ otherwise,
while the random variables $Y_1, \ldots, Y_n$ are independent and identically distributed with probability function
$f_Y(y) = \binom{2}{y} p_Y^y (1 - p_Y)^{2-y}$ for $y \in \{0, 1, 2\}$, and $0$ otherwise,
where $p_X$ and $p_Y$ are between 0 and 1.
(a) Show that the MLEs of $p_X$ and $p_Y$ are $\hat{p}_X = \frac{1}{n}\sum_i X_i$ and $\hat{p}_Y = \frac{\sum_i Y_i}{2n}$.
(b) ...
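A sketch of the log-likelihood step for $\hat{p}_Y$ in (a), assuming $Y_i \sim \mathrm{Binomial}(2, p_Y)$:

```latex
\ell(p_Y) = \sum_{i=1}^n \left[ \log\binom{2}{y_i} + y_i \log p_Y + (2 - y_i)\log(1-p_Y) \right],
\qquad
\frac{\partial \ell}{\partial p_Y}
  = \frac{\sum_i y_i}{p_Y} - \frac{2n - \sum_i y_i}{1-p_Y} = 0
\;\Longrightarrow\;
\hat{p}_Y = \frac{\sum_i y_i}{2n}.
```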
3. Suppose $X_1$, $X_2$, and $X_3$ are independent random variables drawn from a binomial distribution with parameters $p$ and $n$. The observed values are $X_1 = 3$, $X_2 = 4$, and $X_3 = \ldots$
(a) Suppose $n = 12$ and $p$ is unknown. What is the maximum likelihood estimator for $p$?
(b) Suppose $p = 0.4$ and $n$ is unknown. What is the maximum likelihood estimator for $n$? (Note: Since $n$ is discrete you can't use calculus for this; just write the formula and use trial and ...)
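Since the third observed value is cut off in the statement, here is a sketch of the trial-and-error search in part (b) using the hypothetical data $(3, 4, 5)$, where the 5 is a placeholder of mine, not from the source:

```python
from math import comb

p = 0.4
xs = [3, 4, 5]  # hypothetical data: the third observation is missing from the source

def likelihood(n):
    """Joint binomial likelihood of the sample for a candidate value of n."""
    out = 1.0
    for x in xs:
        out *= comb(n, x) * p**x * (1 - p) ** (n - x)
    return out

# n must be at least max(xs); scan a range of candidates and take the argmax.
candidates = range(max(xs), 31)
n_hat = max(candidates, key=likelihood)
print(n_hat)  # 10 for this hypothetical data
```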
Let $X_1, X_2, \ldots, X_n$ denote independent and identically distributed uniform random variables on the interval $[0, 3\beta)$. Obtain the maximum likelihood estimator for $\beta$, $\hat{\beta}$. Use this estimator to provide an estimate of $\mathrm{Var}[X]$ when $x_1 = 1.3$, $x_2 = 3.9$, $x_3 = 2.2$.
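A numeric check, assuming the standard uniform-distribution results that the MLE is $\hat{\beta} = \max_i X_i / 3$ and that $\mathrm{Var}[X] = (3\beta)^2 / 12$ for a Uniform$[0, 3\beta)$ variable:

```python
xs = [1.3, 3.9, 2.2]

beta_hat = max(xs) / 3   # MLE for Uniform[0, 3*beta): max of the sample over 3
width = 3 * beta_hat     # estimated interval width 3*beta_hat
var_hat = width**2 / 12  # variance of Uniform[0, w) is w^2 / 12

print(beta_hat, var_hat)  # approximately 1.3 and 1.2675
```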
Let $X_1$ and $X_2$ be independent random variables with distribution functions $F_1$ and $F_2$, respectively. Let $Y$ be a Bernoulli random variable with parameter $p$. Suppose that $Y$, $X_1$, and $X_2$ are independent. Prove, using the definition of the distribution function, that the distribution function of $Z = Y X_1 + (1 - Y) X_2$ is $F = p F_1 + (1 - p) F_2$. (Don't use moment generating functions or characteristic functions.)
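A simulation sketch of the claimed identity $F_Z = pF_1 + (1-p)F_2$, using Exponential(1) for $F_1$ and the standard normal for $F_2$ as arbitrary illustration choices (any $F_1$, $F_2$ would do):

```python
import math
import random

random.seed(7)

p, n = 0.4, 200_000
t = 0.8  # evaluate the CDFs at an arbitrary point

# Simulate Z = Y*X1 + (1-Y)*X2 with Y ~ Bernoulli(p), X1 ~ Exp(1), X2 ~ N(0,1).
hits = 0
for _ in range(n):
    y = 1 if random.random() < p else 0
    z = random.expovariate(1.0) if y else random.gauss(0.0, 1.0)
    if z <= t:
        hits += 1
empirical = hits / n  # empirical P(Z <= t)

F1 = 1 - math.exp(-t)                        # Exp(1) CDF at t
F2 = 0.5 * (1 + math.erf(t / math.sqrt(2)))  # standard normal CDF at t
theoretical = p * F1 + (1 - p) * F2

print(empirical, theoretical)  # the two values should nearly agree
```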