In the simple linear regression with zero constant term, for $(x_i, y_i)$ where $i = 1, 2, \dots, n$,
$$Y_i = \beta x_i + \varepsilon_i,$$
where $\{\varepsilon_i\}_{i=1}^{n}$ are i.i.d. $N(0, \sigma^2)$.
(a) Derive the normal equation that the LS estimator, $\hat\beta$, satisfies.
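For reference, the normal equation comes from minimizing the residual sum of squares (a sketch of the standard calculus argument, not a full solution):

```latex
S(\beta) = \sum_{i=1}^{n} (Y_i - \beta x_i)^2, \qquad
\frac{dS}{d\beta}\bigg|_{\beta=\hat\beta}
  = -2 \sum_{i=1}^{n} x_i (Y_i - \hat\beta x_i) = 0
\;\Longrightarrow\;
\sum_{i=1}^{n} x_i Y_i = \hat\beta \sum_{i=1}^{n} x_i^2 .
```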
(b) Show that the LS estimator of $\beta$ is given by
$$\hat\beta = \frac{\sum_{i=1}^{n} x_i Y_i}{\sum_{i=1}^{n} x_i^2}.$$
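As a quick numeric sanity check (not part of the proof), the closed form in (b) can be compared against a generic least-squares solver on toy data; the data values below are made up for illustration.

```python
import numpy as np

# Toy data (made up for illustration); model Y = beta*x + noise, no intercept.
rng = np.random.default_rng(0)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = 2.0 * x + rng.normal(0.0, 0.5, size=x.size)

# Closed form from part (b): beta_hat = sum(x_i Y_i) / sum(x_i^2).
beta_hat = np.sum(x * y) / np.sum(x**2)

# Generic least squares through the origin (single-column design matrix).
beta_lstsq = np.linalg.lstsq(x.reshape(-1, 1), y, rcond=None)[0][0]

print(beta_hat, beta_lstsq)  # the two agree to machine precision
```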
(c) Show that
$$E(\hat\beta) = \beta, \qquad
\operatorname{Var}(\hat\beta) = \frac{\sigma^2}{\sum_{i=1}^{n} x_i^2},
\qquad \text{and} \qquad
\hat\beta \sim N\!\left(\beta,\ \frac{\sigma^2}{\sum_{i=1}^{n} x_i^2}\right).$$
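The mean and variance in (c) can be checked by simulation (an illustrative sketch with arbitrary $\beta$, $\sigma$, and design points, not a proof):

```python
import numpy as np

# Monte Carlo check of part (c): E(beta_hat) = beta and
# Var(beta_hat) = sigma^2 / sum(x_i^2).  All values are illustrative.
rng = np.random.default_rng(1)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
beta, sigma = 2.0, 1.0
sxx = np.sum(x**2)                 # sum of x_i^2 = 55

reps = 200_000
eps = rng.normal(0.0, sigma, size=(reps, x.size))
y = beta * x + eps                 # each row is one simulated sample
beta_hats = (y @ x) / sxx          # closed-form estimator per replication

print(beta_hats.mean())            # close to beta = 2
print(beta_hats.var())             # close to sigma^2 / 55 ≈ 0.01818
```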
(d) Let $\hat\varepsilon_i = Y_i - \hat\beta x_i$. Show that $\operatorname{cov}(\hat\varepsilon_i, \hat\beta) = 0$, and that $\hat\varepsilon_i$ and $\hat\beta$ are independent. Let
$$\hat\sigma^2 = \frac{1}{n-1} \sum_{i=1}^{n} (Y_i - \hat\beta x_i)^2
             = \frac{1}{n-1} \sum_{i=1}^{n} \hat\varepsilon_i^2.$$
Show that $\hat\sigma^2$ and $\hat\beta$ are independent.
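A sketch of the covariance computation in (d), using only $\operatorname{Var}(\hat\beta)$ from (c) and the independence of the $Y_j$:

```latex
\begin{aligned}
\operatorname{cov}(\hat\varepsilon_i, \hat\beta)
  &= \operatorname{cov}(Y_i - \hat\beta x_i,\ \hat\beta)
   = \operatorname{cov}(Y_i, \hat\beta) - x_i \operatorname{Var}(\hat\beta), \\
\operatorname{cov}(Y_i, \hat\beta)
  &= \operatorname{cov}\!\left(Y_i,\ \frac{\sum_{j=1}^{n} x_j Y_j}{\sum_{j=1}^{n} x_j^2}\right)
   = \frac{x_i \sigma^2}{\sum_{j=1}^{n} x_j^2}.
\end{aligned}
```

The two terms cancel, so the covariance is zero; since $(\hat\varepsilon_i, \hat\beta)$ is jointly normal, zero covariance implies independence.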
(e) It is given that
$$\frac{(n-1)\hat\sigma^2}{\sigma^2} \sim \chi^2_{n-1}.$$
Show that
$$\frac{\hat\beta - \beta}{\sqrt{\hat\sigma^2 \big/ \sum_{i=1}^{n} x_i^2}} \sim t_{n-1}.$$
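The pivot in (e) can also be checked empirically: with $n = 5$ the two-sided 95% critical value of $t_4$ is $2.776$ (standard $t$ table), so $P(|T| < 2.776)$ should be about $0.95$. The simulation below is an illustrative sketch with arbitrary parameter values.

```python
import numpy as np

# Monte Carlo check of part (e):
# T = (beta_hat - beta) / sqrt(sigma_hat^2 / sum x_i^2)  should follow t_{n-1}.
rng = np.random.default_rng(2)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
n, beta, sigma = x.size, 2.0, 1.0
sxx = np.sum(x**2)

reps = 100_000
eps = rng.normal(0.0, sigma, size=(reps, n))
y = beta * x + eps
beta_hats = (y @ x) / sxx
resid = y - beta_hats[:, None] * x                  # hat eps_i per replication
sigma2_hats = (resid**2).sum(axis=1) / (n - 1)      # hat sigma^2 with n-1 d.f.
t_stats = (beta_hats - beta) / np.sqrt(sigma2_hats / sxx)

coverage = np.mean(np.abs(t_stats) < 2.776)         # t_{4, 0.025} = 2.776
print(coverage)                                     # close to 0.95
```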
(f) Show that the $100(1-\alpha)\%$ prediction interval for $Y_0$ taken at $x = x_0$ is given by
$$\hat\beta x_0 \ \pm\ t_{n-1,\alpha/2}\, \hat\sigma \sqrt{1 + \frac{x_0^2}{\sum_{i=1}^{n} x_i^2}},$$
where $t_{n-1,\alpha/2}$ is the upper $\alpha/2$ percentile of the $t$ distribution with $n-1$ degrees of freedom.
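The prediction interval in (f) can be checked by simulating both the fitted data and a fresh observation $Y_0$ at $x_0$; about 95% of the intervals should contain $Y_0$. The parameter values below are illustrative, and $t_{4,0.025} = 2.776$ is the standard table value for $n = 5$.

```python
import numpy as np

# Monte Carlo coverage check of the 95% prediction interval in part (f):
# beta_hat*x0 ± t_{n-1,a/2} * sigma_hat * sqrt(1 + x0^2 / sum x_i^2).
rng = np.random.default_rng(3)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
n, beta, sigma, x0, tcrit = x.size, 2.0, 1.0, 2.5, 2.776
sxx = np.sum(x**2)

reps = 50_000
eps = rng.normal(0.0, sigma, size=(reps, n))
y = beta * x + eps
beta_hats = (y @ x) / sxx
sigma_hats = np.sqrt(((y - beta_hats[:, None] * x)**2).sum(axis=1) / (n - 1))
half = tcrit * sigma_hats * np.sqrt(1.0 + x0**2 / sxx)   # interval half-width

y0 = beta * x0 + rng.normal(0.0, sigma, size=reps)       # fresh obs at x0
coverage = np.mean(np.abs(y0 - beta_hats * x0) <= half)
print(coverage)                                          # close to 0.95
```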