3. Suppose X1, X2, and X3 are independent random variables drawn from a binomial distribution with parameters p and n. The observed values are X1 = 3, X2 = 4, and ... (a) Suppose n = 12 and p is unknown. What...
(4 marks) Let m be an integer with m ≥ 2. Suppose that X1, X2, ..., Xm are independent Binomial(n, p) random variables, where n is known and p is unknown, with p ∈ (0, 1). Write down the expression for the likelihood function L(p). We assume that min(x1, ..., xm) < n and max(x1, ..., xm) > 0. (5 marks) Find dL/dp, and give all possible solutions to the equation dL/dp = 0 ...
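For the likelihood in this question, setting dL/dp = 0 on (0, 1) leads to the candidate p̂ = Σxᵢ/(mn). A minimal numerical check of that claim, using hypothetical data (the values xs = [3, 4, 5] and n = 12 are illustrative, not from the question):

```python
import math

def binom_likelihood(p, xs, n):
    """L(p) = prod_i C(n, x_i) * p^x_i * (1-p)^(n-x_i)."""
    L = 1.0
    for x in xs:
        L *= math.comb(n, x) * p**x * (1 - p) ** (n - x)
    return L

# Hypothetical data: m = 3 observations from Binomial(n = 12, p).
xs, n = [3, 4, 5], 12
m = len(xs)

# Candidate solution of dL/dp = 0 on (0, 1):
p_hat = sum(xs) / (m * n)

# Compare against a fine grid of alternative p values.
grid = [i / 1000 for i in range(1, 1000)]
best = max(grid, key=lambda p: binom_likelihood(p, xs, n))
print(p_hat, best)  # both close to 1/3
```

This is only a sanity check of the stationary point, not a substitute for the calculus the question asks for.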
Suppose X1, ..., Xn are independent random variables from a Negative Binomial distribution with parameters r (known) and p. Find the maximum likelihood estimator of p.
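The question's parameter symbols were lost in extraction; assuming the conventional parametrization where X counts failures before the r-th success, with r known, the MLE works out to p̂ = mr/(mr + Σxᵢ). A sketch verifying that closed form against a grid search, on made-up data:

```python
import math

def negbin_loglik(p, xs, r):
    """Log-likelihood when P(X = x) = C(x + r - 1, x) p^r (1 - p)^x."""
    return sum(
        math.log(math.comb(x + r - 1, x)) + r * math.log(p) + x * math.log(1 - p)
        for x in xs
    )

# Hypothetical sample with known r = 4.
xs, r = [2, 5, 3, 7, 4], 4
m = len(xs)

p_hat = m * r / (m * r + sum(xs))  # candidate closed-form MLE, r / (r + x̄)

grid = [i / 10000 for i in range(1, 10000)]
best = max(grid, key=lambda p: negbin_loglik(p, xs, r))
print(p_hat, best)
```

Other parametrizations (e.g. X counting total trials) shift the formula, so check which one the course uses.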
Assume that we have three independent observations x1, x2, x3, where Xi ~ Binomial(n = 7, p) for i ∈ {1, 2, 3}. The value of p ∈ (0, 1) is not known. When we have observations like this from different, independent random variables, we can find joint probabilities by multiplying together the individual probabilities. For example, ... This should remind you of the discussion on statistical independence of random variables that can be found in the course book (see page 22). Answer the following questions: a...
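The multiplication rule described above can be sketched directly; the observed values and the value of p below are illustrative assumptions, not part of the question:

```python
from math import comb

def binom_pmf(x, n, p):
    """P(X = x) for X ~ Binomial(n, p)."""
    return comb(n, x) * p**x * (1 - p) ** (n - x)

n, p = 7, 0.4          # p fixed here only to make the example concrete
obs = [2, 3, 5]        # hypothetical observed values x1, x2, x3

# By independence, the joint probability is the product of the marginals.
joint = 1.0
for x in obs:
    joint *= binom_pmf(x, n, p)
print(joint)
```

Viewed as a function of p with the observations fixed, this same product is the likelihood function the later parts of the question build on.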
3. (15 Points) Let X1 ~ Bernoulli(p) and X2 ~ Bernoulli(3p) be independent Bernoulli random variables, where p ∈ [0, 1/3]. Derive the Maximum Likelihood Estimator (MLE) of p. Denote it by p̂.
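Because p is restricted to [0, 1/3], the MLE can land on the boundary for some outcomes, which is easy to miss in the calculus. A quick grid-search sketch over all four possible samples (x1, x2), purely as a check on a hand derivation:

```python
def lik(p, x1, x2):
    """Joint likelihood of (X1 = x1, X2 = x2) as a function of p in [0, 1/3]."""
    return p**x1 * (1 - p) ** (1 - x1) * (3 * p) ** x2 * (1 - 3 * p) ** (1 - x2)

grid = [i / 30000 for i in range(10001)]  # fine grid over [0, 1/3]
results = {}
for x1 in (0, 1):
    for x2 in (0, 1):
        results[(x1, x2)] = max(grid, key=lambda p: lik(p, x1, x2))
print(results)
```

The grid maximizer suggests interior solutions only for some outcomes (e.g. (1, 0) peaks near 1/6) and boundary solutions otherwise, which is the structure the derivation should reproduce.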
2. Let X1, X2, ..., Xn be independent continuous random variables from the following distribution: f(x) = a x^(-(a+1)), where x > 1 and a > 1. You may use the fact: E[X] = a/(a - 1).
2.1 Show that the maximum likelihood estimator of a is â_MLE = n / Σi log Xi.
2.2 Show that the method of moments estimator for a is â_MOM = X̄ / (X̄ - 1).
2.3 Derive a sufficient statistic for a. What theorem are you using to determine sufficiency?
2.4 Show that the Fisher information in the whole sample is I(a) = n/a².
2.5 What is the Cramer-Rao lower bound...
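The two estimators in parts 2.1 and 2.2 can be sanity-checked by simulation. This sketch assumes the density f(x) = a x^(-(a+1)) on x > 1 (a Pareto with scale 1), sampled by inverting the CDF F(x) = 1 - x^(-a); the true a and the sample size are arbitrary choices:

```python
import math
import random

random.seed(0)
a_true, n = 3.0, 200_000

# Inverse-CDF sampling: F(x) = 1 - x^(-a) for x > 1, so X = U^(-1/a),
# with U uniform on (0, 1].
xs = [(1.0 - random.random()) ** (-1.0 / a_true) for _ in range(n)]

a_mle = n / sum(math.log(x) for x in xs)      # 2.1: n / sum of log X_i
xbar = sum(xs) / n
a_mom = xbar / (xbar - 1)                     # 2.2: solve E[X] = a/(a-1) for a
print(a_mle, a_mom)
```

Both estimates should sit close to the true a = 3 at this sample size, consistent with the closed forms the question asks you to derive.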
Suppose that X is a random variable from a binomial distribution with parameters n = 12 and p. Consider the point estimate p̂ = X/14. 1. What is the bias of this estimate? 2. What is the value of the mean square error of this estimate if the actual value of p is 0.735?
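Both parts follow from E[X] = np, Var(X) = np(1 - p), and MSE = Var + bias². A minimal worked computation for the stated p = 0.735:

```python
n, p = 12, 0.735

# p_hat = X / 14 with X ~ Binomial(12, p)
e_phat = n * p / 14                  # E[X/14] = np/14
bias = e_phat - p                    # simplifies to -p/7
var_phat = n * p * (1 - p) / 14**2   # Var(X) / 14^2
mse = var_phat + bias**2
print(bias, mse)
```

With p = 0.735 this gives bias = -0.105 and MSE = 0.02295.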
QUESTION 2. Let X1, ..., Xn be a random sample from a N(μ, σ²) distribution, and let S² and S̃² = ((n-1)/n) S² be two estimators of σ². Given: E(S²) = σ² and V(S²) = 2σ⁴/(n-1), where S² = Σi (Xi - X̄)² / (n-1). (a) Determine: (i) E(S̃²); (ii) V(S̃²); and (iii) MSE(S̃²). (b) Which of S² and S̃² has the larger mean square error? (c) Suppose that θ̂n is an estimator of θ based on a random sample of size n. Another equivalent definition of...
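The MSE comparison in part (b) can be checked by simulation before doing the algebra. The choices of μ, σ², n, and replication count below are arbitrary; theory predicts MSE(S²) = 2σ⁴/(n-1) and MSE(S̃²) = (2n-1)σ⁴/n², with S̃² the smaller:

```python
import random
import statistics

random.seed(1)
mu, sigma2, n, reps = 0.0, 4.0, 10, 20000

mse_s2 = mse_s2t = 0.0
for _ in range(reps):
    xs = [random.gauss(mu, sigma2 ** 0.5) for _ in range(n)]
    s2 = statistics.variance(xs)       # divides by n - 1 (this is S^2)
    s2_tilde = (n - 1) / n * s2        # divides by n (this is S~^2)
    mse_s2 += (s2 - sigma2) ** 2
    mse_s2t += (s2_tilde - sigma2) ** 2
mse_s2 /= reps
mse_s2t /= reps

print(mse_s2, mse_s2t)  # compare against 2*sigma^4/(n-1) and (2n-1)*sigma^4/n^2
```

The biased estimator trades a small bias for a larger variance reduction, which is why its MSE comes out smaller.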
4. (24 marks) Suppose that the random variables Y1, ..., Yn satisfy Yi = β0 + β1 Xi + εi, i = 1, ..., n, where β0 and β1 are parameters, X1, ..., Xn are constants, and ε1, ..., εn are independent and identically distributed random variables with εi ~ N(0, σ²), where σ² is a third unknown parameter. This is the familiar form of the simple linear regression model, where the parameters β0, β1, and σ² explain the relationship between a dependent (or response) variable Y...
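Under the normal-errors model above, the maximum likelihood estimates of β0 and β1 coincide with least squares, and the MLE of σ² is RSS/n (not RSS/(n-2)). A self-contained sketch on simulated data; the true parameter values and design points are arbitrary assumptions:

```python
import random

random.seed(2)
b0_true, b1_true, sigma = 1.5, 2.0, 0.5
xs = [i / 10 for i in range(1, 51)]  # fixed constants X_1, ..., X_n
ys = [b0_true + b1_true * x + random.gauss(0, sigma) for x in xs]

n = len(xs)
xbar, ybar = sum(xs) / n, sum(ys) / n

# Least-squares / ML estimates of the slope and intercept.
sxx = sum((x - xbar) ** 2 for x in xs)
sxy = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
b1 = sxy / sxx
b0 = ybar - b1 * xbar

# MLE of sigma^2: residual sum of squares divided by n.
rss = sum((y - b0 - b1 * x) ** 2 for x, y in zip(xs, ys))
sigma2_mle = rss / n
print(b0, b1, sigma2_mle)
```

The estimates should land near the generating values (1.5, 2.0, 0.25), which is a useful check when deriving these formulas by maximizing the normal likelihood.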
Suppose X1, X2, ... are independent discrete random variables, all having the same distribution, with E[Xi] > 0 for each i. Is this true for any two positive integers n and m? Why, or why not?