Consider a random sample (X1, Y1), (X2, Y2), …, (Xn, Yn) where Y | X = x is modeled by a N(β0 + β1x, σ²) distribution, where β0, β1, and σ² are unknown.
(a) Prove that the MLE of β1 is an unbiased estimator of β1.
(b) Prove that the MLE of β0 is an unbiased estimator of β0.
If θ̂ is the MLE of a parameter θ, then θ̂ is said to be an unbiased estimator of θ if E(θ̂) = θ.

(a) Treating the xi as fixed, the MLE of β1 is β̂1 = Σ(xi − x̄)(Yi − Ȳ) / Σ(xi − x̄)² = Σ(xi − x̄)Yi / Σ(xi − x̄)². Since E(Yi) = β0 + β1xi and Σ(xi − x̄) = 0, we get E(β̂1) = Σ(xi − x̄)(β0 + β1xi) / Σ(xi − x̄)² = β1 Σ(xi − x̄)xi / Σ(xi − x̄)² = β1, so β̂1 is unbiased.

(b) The MLE of β0 is β̂0 = Ȳ − β̂1 x̄. Then E(β̂0) = E(Ȳ) − x̄ E(β̂1) = (β0 + β1 x̄) − β1 x̄ = β0, so β̂0 is unbiased.
**If the answer does not match, or you have any confusion, please leave a comment.
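As a numerical sanity check of (a) and (b) — not a substitute for the proof — a short Monte Carlo simulation can be run; the true parameter values, sample size, and design points below are arbitrary illustrative choices.

```python
import random

# Monte Carlo check (not a proof): in the normal-error model the MLEs of
# beta0 and beta1 are the least-squares estimates, so their Monte Carlo
# averages should land near the true values.
random.seed(0)
b0_true, b1_true, sigma = 2.0, 3.0, 1.5   # arbitrary illustrative values
x = [i / 10 for i in range(1, 21)]        # fixed design points
xbar = sum(x) / len(x)
sxx = sum((xi - xbar) ** 2 for xi in x)

reps = 20000
b0_sum = b1_sum = 0.0
for _ in range(reps):
    y = [b0_true + b1_true * xi + random.gauss(0, sigma) for xi in x]
    ybar = sum(y) / len(y)
    b1_hat = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / sxx
    b0_hat = ybar - b1_hat * xbar
    b0_sum += b0_hat
    b1_sum += b1_hat

b0_mean = b0_sum / reps   # should be close to b0_true
b1_mean = b1_sum / reps   # should be close to b1_true
```

The averages of β̂0 and β̂1 over many simulated datasets settle near the true values, which is exactly what unbiasedness predicts.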
Suppose that X1, X2, …, Xn and Y1, Y2, …, Yn are independent random samples from populations with the same mean μ and variances σ1² and σ2², respectively; that is, Xi ~ N(μ, σ1²) and Yi ~ N(μ, σ2²). Show that (2X̄ + 3Ȳ)/5 is a consistent estimator of μ.
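A quick simulation can illustrate (not prove) the consistency claim, assuming the intended estimator is (2X̄ + 3Ȳ)/5; μ, σ1, and σ2 below are arbitrary choices.

```python
import random

# Illustration of consistency: the average absolute error of the estimator
# (2*Xbar + 3*Ybar)/5 shrinks as the sample size n grows.
random.seed(1)
mu, s1, s2 = 5.0, 2.0, 3.0   # arbitrary illustrative values

def estimate(n):
    xbar = sum(random.gauss(mu, s1) for _ in range(n)) / n
    ybar = sum(random.gauss(mu, s2) for _ in range(n)) / n
    return (2 * xbar + 3 * ybar) / 5

reps = 200
err_small = sum(abs(estimate(100) - mu) for _ in range(reps)) / reps
err_large = sum(abs(estimate(10000) - mu) for _ in range(reps)) / reps
# err_large should be roughly 10x smaller than err_small (error scales ~ 1/sqrt(n))
```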
a) Consider a random sample {X1, X2, …, Xn} of X from a uniform distribution over [0, θ], where 0 < θ < ∞ and θ is unknown. Is X̄ = (1/n) Σ Xi an unbiased estimator for θ? Please justify your answer. b) Consider a random sample {X1, X2, …, Xn} of X from N(μ, σ²), where μ and σ² are unknown. Show that X̄² + S² is an unbiased estimator for μ² + σ², where X̄ = (1/n) Σ Xi and S² = (1/n) Σ (Xi − X̄)².
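For part (a), assuming the candidate estimator is the sample mean X̄ (the original notation here is partly garbled), a simulation quickly suggests the answer: E(X̄) = θ/2, not θ. The values of θ and n below are arbitrary.

```python
import random

# Numerical check (not a proof): for Uniform[0, theta], the sample mean
# averages to theta/2, so Xbar is a biased estimator of theta.
random.seed(2)
theta, n, reps = 2.0, 50, 5000   # arbitrary illustrative values
xbar_sum = 0.0
for _ in range(reps):
    xs = [random.uniform(0, theta) for _ in range(n)]
    xbar_sum += sum(xs) / n
xbar_mean = xbar_sum / reps
# xbar_mean sits near theta/2 = 1.0, far from theta = 2.0
```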
4. (24 marks) Suppose that the random variables Y1, …, Yn satisfy Yi = β0 + β1Xi + εi, i = 1, …, n, where β0 and β1 are parameters, X1, …, Xn are constants, and ε1, …, εn are independent and identically distributed random variables with εi ~ N(0, σ²), where σ² is a third unknown parameter. This is the familiar form for a simple linear regression model, where the parameters β0, β1, and σ² explain the relationship between a dependent (or response) variable Y...
Let Y1, Y2, …, Yn be a random sample from the distribution with density f(y) = θy^(θ−1), where 0 < y < 1 and 0 < θ < ∞. Show that the maximum likelihood estimator (MLE) for θ is θ̂ = −n / Σ ln Yi.
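The log-likelihood is ℓ(θ) = n ln θ + (θ − 1) Σ ln yi, and setting ℓ′(θ) = n/θ + Σ ln yi = 0 yields θ̂ = −n / Σ ln Yi. A quick simulated check (θ and n are arbitrary choices) uses inverse-CDF sampling, since F(y) = y^θ on (0, 1):

```python
import math
import random

# Numerical check of the MLE formula theta_hat = -n / sum(ln Y_i).
random.seed(3)
theta, n = 2.5, 200000   # arbitrary illustrative values
# F(y) = y**theta, so Y = U**(1/theta) with U ~ Uniform(0,1] is an inverse-CDF draw
ys = [(1.0 - random.random()) ** (1 / theta) for _ in range(n)]
theta_hat = -n / sum(math.log(y) for y in ys)
# theta_hat should sit close to the true theta
```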
Suppose that Y1, Y2, …, Yn denote a random sample from a population with a Rayleigh distribution (Weibull distribution with parameters 2, θ) with density function f(y | θ) = (2y/θ) e^(−y²/θ), θ > 0, y > 0. Consider the estimators θ̂1 = Y(1) = min{Y1, Y2, …, Yn} and θ̂2 = (1/n) Σ Yi². ii) (10 points) Determine if θ̂1 and θ̂2 are unbiased estimators, and in...
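A Monte Carlo look at both estimators (not a proof; θ, n, and the repetition count are arbitrary) exploits the fact that if Y has this density, then Y² is exponential with mean θ, so Y = √(−θ ln U) is an inverse-CDF draw:

```python
import math
import random

# Simulated comparison of theta1_hat = Y(1) and theta2_hat = mean of Y_i^2.
random.seed(4)
theta, n, reps = 4.0, 50, 4000   # arbitrary illustrative values
t1_sum = t2_sum = 0.0
for _ in range(reps):
    ys = [math.sqrt(-theta * math.log(1.0 - random.random())) for _ in range(n)]
    t1_sum += min(ys)                       # theta1_hat
    t2_sum += sum(y * y for y in ys) / n    # theta2_hat
t1_mean = t1_sum / reps
t2_mean = t2_sum / reps
# t2_mean lands near theta (E[theta2_hat] = theta); t1_mean is far below theta
```

The averages suggest θ̂2 is unbiased while θ̂1 is not, matching the analytic answer E(θ̂2) = E(Y²) = θ.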
Let X1, X2, …, Xn be a random sample where each Xi follows a normal distribution with mean μ and an unknown standard deviation σ. Let K = (n − 1)S²/n. We wish to use K as an estimator of σ². (a) [2 points] Assume K ~ Gamma(α = (n − 1)/2, β = 2σ²/n). Compute the bias of K. (b) [1 point] If K is a biased estimator for σ², state the function of K that would make it an unbiased...
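A simulation can illustrate the bias, assuming K = (n − 1)S²/n with S² the usual unbiased sample variance (μ, σ, and n below are arbitrary choices); the Gamma assumption gives E(K) = αβ = (n − 1)σ²/n, so the bias is −σ²/n and nK/(n − 1) is unbiased.

```python
import random

# Simulated check: the average of K = (n-1)*S^2/n falls short of sigma^2
# by about sigma^2/n, while n*K/(n-1) = S^2 averages to sigma^2.
random.seed(5)
mu, sigma, n, reps = 0.0, 2.0, 10, 30000   # arbitrary illustrative values
k_sum = 0.0
for _ in range(reps):
    xs = [random.gauss(mu, sigma) for _ in range(n)]
    xbar = sum(xs) / n
    s2 = sum((x - xbar) ** 2 for x in xs) / (n - 1)  # unbiased sample variance
    k_sum += (n - 1) * s2 / n                        # K
k_mean = k_sum / reps
# here E[K] = 9 * 4 / 10 = 3.6, versus sigma^2 = 4
```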
QUESTION 3. Let Y1, Y2, …, Yn denote a random sample of size n from a population whose density is given by a Pareto distribution. Consider the estimator β̂ = Y(1) = min(Y1, …, Yn), where β is unknown. (a) Derive the bias of the estimator β̂. (b) Derive the mean square error of β̂.
In 10.11, let X1, X2, …, Xn and Y1, Y2, …, Ym be independent samples from N(μ, σ1²) and N(μ, σ2²), respectively, where μ, σ1², and σ2² are unknown. Let ρ = σ1²/σ2² and g = m/n, and consider the problem of unbiased estimation of μ.
7.20 Consider Y1, …, Yn as defined in Exercise 7.19. (a) Show that Σ Yi / Σ xi is an unbiased estimator of β. (b) Calculate the exact variance of Σ Yi / Σ xi and compare it to the variance of the MLE. 7.19 Suppose that the random variables Y1, …, Yn satisfy Yi = βxi + εi, i = 1, …, n, where x1, …, xn are fixed constants, and ε1, …, εn are iid N(0, σ²), σ² unknown. (a) Find a two-dimensional sufficient statistic for (β, σ²). (b) Find the MLE of...
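For 7.20, a hedged numerical sketch (β, σ, and the xi below are arbitrary choices) compares the two estimators; the MLE here is Σ xiYi / Σ xi², and its variance σ²/Σ xi² is never larger than the variance nσ²/(Σ xi)² of Σ Yi / Σ xi, by the Cauchy–Schwarz inequality (Σ xi)² ≤ n Σ xi².

```python
import random

# Simulated comparison: both estimators average to beta, but the MLE
# sum(x_i*Y_i)/sum(x_i^2) shows the smaller spread.
random.seed(6)
beta, sigma = 2.0, 1.0                     # arbitrary illustrative values
x = [1.0, 2.0, 3.0, 4.0, 5.0]              # fixed constants
sx = sum(x)
sxx = sum(xi * xi for xi in x)
reps = 20000
a_vals, mle_vals = [], []
for _ in range(reps):
    y = [beta * xi + random.gauss(0, sigma) for xi in x]
    a_vals.append(sum(y) / sx)                                   # sum(Y_i)/sum(x_i)
    mle_vals.append(sum(xi * yi for xi, yi in zip(x, y)) / sxx)  # MLE

def _mean(v):
    return sum(v) / len(v)

def _var(v):
    m = _mean(v)
    return sum((u - m) ** 2 for u in v) / len(v)

a_mean = _mean(a_vals)
a_var, mle_var = _var(a_vals), _var(mle_vals)
```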
1. (40) Suppose that X1, X2, …, Xn forms an independent and identically distributed sample from a normal distribution with mean μ and variance σ², both unknown. (a) Derive the sample variance, S², for this random sample. (b) Derive the maximum likelihood estimators (MLE) of μ and σ², denoted μ̂ and σ̂², respectively. (c) Find the MLE of μ³. (d) Derive the method of moments estimators of μ and σ², denoted μ̂MOM and σ̂²MOM, respectively. (e) Show that μ̂ and...
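For reference, a sketch of the standard derivation behind parts (b)–(d) (not the assigned solution) starts from the log-likelihood:

```latex
% Log-likelihood of an iid N(mu, sigma^2) sample
\ell(\mu,\sigma^2) = -\frac{n}{2}\ln(2\pi\sigma^2)
  - \frac{1}{2\sigma^2}\sum_{i=1}^{n}(x_i-\mu)^2
% Setting the partial derivatives to zero:
\frac{\partial \ell}{\partial \mu}
  = \frac{1}{\sigma^2}\sum_{i=1}^{n}(x_i-\mu) = 0
  \;\Longrightarrow\; \hat{\mu} = \bar{X}
\frac{\partial \ell}{\partial \sigma^2}
  = -\frac{n}{2\sigma^2} + \frac{1}{2\sigma^4}\sum_{i=1}^{n}(x_i-\mu)^2 = 0
  \;\Longrightarrow\; \hat{\sigma}^2 = \frac{1}{n}\sum_{i=1}^{n}(X_i-\bar{X})^2
```

By the invariance property of MLEs, part (c) gives X̄³ as the MLE of μ³, and for the normal family the method-of-moments estimators in part (d) coincide with the MLEs.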