3. Let $y = (y_1, \ldots, y_n)$ be a set of responses, and consider the linear...
Q.12. (a) Consider the ordinary least squares estimate $\hat{\beta}$ of $\beta$ in the classical linear regression model $Y_i = \alpha + \beta X_i + u_i$, $i = 1, 2, \ldots, n$, with $x_i = X_i - \bar{X}$, where $\bar{X} = \frac{1}{n}\sum_{i=1}^{n} X_i$. Show that, if $\operatorname{Var}(u_i) = \sigma_u^2$, no other linear unbiased estimator of $\beta$ can be constructed with a smaller variance. (All symbols have their usual meanings.) [18]
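One way the standard Gauss–Markov argument for (a) runs (a sketch, not necessarily the wording the marker expects): write $\hat{\beta} = \sum_i w_i Y_i$ with $w_i = x_i \big/ \sum_j x_j^2$, and let $\tilde{\beta} = \sum_i c_i Y_i$ be any other linear unbiased estimator, so that $\sum_i c_i = 0$ and $\sum_i c_i X_i = 1$. Then
\[
\operatorname{Var}(\tilde{\beta}) = \sigma_u^2 \sum_i c_i^2
= \sigma_u^2 \sum_i (c_i - w_i)^2 + \sigma_u^2 \sum_i w_i^2
\;\ge\; \sigma_u^2 \sum_i w_i^2 = \frac{\sigma_u^2}{\sum_i x_i^2} = \operatorname{Var}(\hat{\beta}),
\]
because the cross term $\sum_i (c_i - w_i) w_i$ vanishes under the two unbiasedness constraints.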
I. Consider a variable $y = \theta + \varepsilon$, where $\theta$ is an unknown parameter and $\varepsilon$ is a random variable with mean zero. (a) What is the expected value of $y$? (b) Suppose you draw a sample $y_1, \ldots, y_n$. Derive the least squares estimator for $\theta$. For full credit you must check the second-order condition. (c) Can this estimator $\hat{\theta}$ be described as a method-of-moments estimator? (d) Now suppose $\varepsilon$ is independently normally distributed with mean...
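For part (b), the usual derivation (a sketch under the stated model):
\[
S(\theta) = \sum_{i=1}^{n} (y_i - \theta)^2,\qquad
S'(\theta) = -2\sum_{i=1}^{n}(y_i - \theta) = 0 \;\Rightarrow\; \hat{\theta} = \bar{y},\qquad
S''(\theta) = 2n > 0,
\]
so $\hat{\theta} = \bar{y}$ is indeed a minimizer; matching the first sample moment $\bar{y}$ to $E(y) = \theta$ gives the same estimator, which bears on part (c).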
1. Consider the simple linear regression model $Y_i = \beta_0 + \beta_1 x_i + \varepsilon_i$, where $\varepsilon_1, \ldots, \varepsilon_n$ are i.i.d. $N(0, \sigma^2)$, for $i = 1, 2, \ldots, n$. Let $b_1 = S_{xy}/S_{xx}$ and $b_0 = \bar{Y} - b_1 \bar{x}$ be the least squares estimators of $\beta_1$ and $\beta_0$, respectively. We showed in class that $\bar{Y} \sim N(\beta_0 + \beta_1 \bar{x},\ \sigma^2/n)$ and $b_1 \sim N(\beta_1,\ \sigma^2/S_{xx})$. We also showed in class that $b_1$ and $\bar{Y}$ are uncorrelated, i.e. $\sigma\{\bar{Y}, b_1\} = 0$. (a) Show that $b_0$ is...
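Part (a) is cut off, but it presumably concerns the distribution of $b_0$; whatever its exact wording, the facts quoted above combine in the standard way (a sketch):
\[
E(b_0) = E(\bar{Y}) - \bar{x}\,E(b_1) = \beta_0,\qquad
\operatorname{Var}(b_0) = \operatorname{Var}(\bar{Y}) + \bar{x}^2\operatorname{Var}(b_1) = \sigma^2\!\left(\frac{1}{n} + \frac{\bar{x}^2}{S_{xx}}\right),
\]
using the zero covariance between $\bar{Y}$ and $b_1$; as a linear combination of jointly normal quantities, $b_0$ is itself normally distributed.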
The linear regression model in matrix format is $Y = X\beta + \varepsilon$, with the usual definitions. Let $E(\varepsilon|X) = 0$ and
\[
\operatorname{Var}(\varepsilon|X) = \Sigma =
\begin{pmatrix}
\gamma_1 & 0 & \cdots & 0\\
0 & \gamma_2 & \cdots & 0\\
\vdots & & \ddots & \vdots\\
0 & 0 & \cdots & \gamma_N
\end{pmatrix}.
\]
Notice that as a covariance matrix, $\Sigma$ is symmetric and nonnegative definite. (i) Derive $\operatorname{Var}(\hat{\beta}_{OLS}|X)$. (ii) Let $\tilde{\beta} := CY$ be any other linear unbiased estimator, where $C$ is a $K \times N$ function of $X$. Prove $\operatorname{Var}(\tilde{\beta}|X) \geq (X'\Sigma^{-1}X)^{-1}$. The...
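A quick numerical illustration of (i) and of the bound in (ii), with a made-up design matrix and error variances (the numbers and dimensions below are illustrative assumptions, not from the problem):

    import numpy as np

    rng = np.random.default_rng(0)
    N, K = 50, 3
    X = np.column_stack([np.ones(N), rng.normal(size=(N, K - 1))])   # illustrative design matrix
    gamma = rng.uniform(0.5, 3.0, size=N)                            # illustrative error variances
    Sigma = np.diag(gamma)

    XtX_inv = np.linalg.inv(X.T @ X)
    var_ols = XtX_inv @ X.T @ Sigma @ X @ XtX_inv                    # Var(b_OLS | X), sandwich form
    gls_bound = np.linalg.inv(X.T @ np.linalg.inv(Sigma) @ X)        # (X' Sigma^{-1} X)^{-1}

    # The claimed inequality: Var(b_OLS | X) - (X' Sigma^{-1} X)^{-1} is nonnegative definite.
    print(np.linalg.eigvalsh(var_ols - gls_bound).min() >= -1e-10)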
2. Consider the simple linear regression model $Y_i = \beta_0 + \beta_1 x_i + \varepsilon_i$, where $\varepsilon_1, \ldots, \varepsilon_n$ are i.i.d. $N(0, \sigma^2)$, for $i = 1, 2, \ldots, n$. Suppose that we would like to estimate the mean response at $x = x^*$, that is, we want to estimate $\mu_{Y|x^*} = \beta_0 + \beta_1 x^*$. The least squares estimator for $\mu_{Y|x^*}$ is $\hat{\mu}_{Y|x^*} = b_0 + b_1 x^*$, where $b_0, b_1$ are the least squares estimators of $\beta_0, \beta_1$. (a) Show that the least squares estimator for...
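The question is cut off after (a), but the standard facts about $\hat{\mu}_{Y|x^*}$ that such parts usually target are (a sketch, assuming the same notation as in Problem 1 above):
\[
E(\hat{\mu}_{Y|x^*}) = \beta_0 + \beta_1 x^*,\qquad
\operatorname{Var}(\hat{\mu}_{Y|x^*}) = \sigma^2\!\left(\frac{1}{n} + \frac{(x^* - \bar{x})^2}{S_{xx}}\right),
\]
obtained by writing $\hat{\mu}_{Y|x^*} = \bar{Y} + b_1(x^* - \bar{x})$ and using the fact that $\bar{Y}$ and $b_1$ are uncorrelated.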
Q.1 Consider the multiple linear regression model $Y = X\beta + \varepsilon$, where $\varepsilon \sim \mathrm{MVN}(0, \sigma^2 V)$ and $V \neq I_n$ is a diagonal matrix. (a) Derive the weighted least squares estimator for $\beta$, i.e., $\hat{\beta}_{WLS}$. (b) Show $\hat{\beta}_{WLS}$ is an unbiased estimator for $\beta$. (c) Derive the variances of $\hat{\beta}_{WLS}$ and the OLS estimator of $\beta$. Is the OLS estimator of $\beta$ still the BLUE? In one sentence, explain why or why not.
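A compact numerical sketch of (a)–(c): it computes both estimators and their exact conditional variances on made-up data (the design, the diagonal of $V$, and the parameter values are illustrative assumptions, not part of the problem):

    import numpy as np

    rng = np.random.default_rng(1)
    n = 40
    X = np.column_stack([np.ones(n), rng.normal(size=n)])       # illustrative design
    v = rng.uniform(0.5, 4.0, size=n)                           # diagonal of V (known, not identity)
    sigma2, beta = 1.0, np.array([2.0, -1.0])                   # illustrative true values
    y = X @ beta + rng.normal(scale=np.sqrt(sigma2 * v))

    V_inv = np.diag(1.0 / v)
    b_wls = np.linalg.solve(X.T @ V_inv @ X, X.T @ V_inv @ y)   # (X'V^{-1}X)^{-1} X'V^{-1} y
    b_ols = np.linalg.solve(X.T @ X, X.T @ y)

    var_wls = sigma2 * np.linalg.inv(X.T @ V_inv @ X)                   # Var(b_WLS)
    XtX_inv = np.linalg.inv(X.T @ X)
    var_ols = sigma2 * XtX_inv @ X.T @ np.diag(v) @ X @ XtX_inv         # Var(b_OLS), sandwich form
    print(np.diag(var_wls), np.diag(var_ols))   # WLS variances are no larger: WLS is the BLUE here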
Problem 3: Absence of Intercept. Consider the regression model $Y_i = \beta X_i + u_i$, where $u_i$ and $X_i$ satisfy Assumptions SLR.1-SLR.5. (i) Let $\tilde{\beta}$ denote an estimator of $\beta$ that is constructed as $\tilde{\beta} = \bar{Y}/\bar{X}$, where $\bar{Y}$ and $\bar{X}$ are the sample means of $Y_i$ and $X_i$, respectively. Show that $\tilde{\beta}$ is conditionally unbiased. (ii) Derive the least squares estimator of $\beta$. (iii) Show that the estimator is conditionally unbiased. (iv) Derive the conditional variance of the estimator.
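A small simulation comparing the two estimators in this problem, the ratio estimator $\bar{Y}/\bar{X}$ and the no-intercept least squares estimator $\sum_i X_i Y_i / \sum_i X_i^2$ (the true slope, sample size, and regressor range below are illustrative assumptions):

    import numpy as np

    rng = np.random.default_rng(2)
    beta, n, reps = 1.5, 200, 5000
    X = rng.uniform(1.0, 3.0, size=n)          # regressor held fixed across replications, away from 0
    ratio_est, ls_est = [], []
    for _ in range(reps):
        u = rng.normal(size=n)
        Y = beta * X + u
        ratio_est.append(Y.mean() / X.mean())              # beta-tilde = Ybar / Xbar
        ls_est.append((X * Y).sum() / (X ** 2).sum())      # least squares slope without intercept
    print(np.mean(ratio_est), np.mean(ls_est))   # both close to beta: conditionally unbiased
    print(np.var(ratio_est), np.var(ls_est))     # the least squares estimator has the smaller variance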
4. (24 marks) Suppose that the random variables $Y_1, \ldots, Y_n$ satisfy $Y_i = \beta_0 + \beta_1 X_i + \varepsilon_i$, $i = 1, \ldots, n$, where $\beta_0$ and $\beta_1$ are parameters, $X_1, \ldots, X_n$ are constants, and $\varepsilon_1, \ldots, \varepsilon_n$ are independent and identically distributed random variables with $\varepsilon_i \sim N(0, \sigma^2)$, where $\sigma^2$ is a third unknown parameter. This is the familiar form of the simple linear regression model, where the parameters $\beta_0$, $\beta_1$, and $\sigma^2$ explain the relationship between a dependent (or response) variable $Y$...
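The rest of the question is cut off; as background for whatever it asks, here is a minimal simulation of this model together with the usual least squares fit and error-variance estimate (all numerical values are illustrative assumptions):

    import numpy as np

    rng = np.random.default_rng(3)
    n = 30
    X = np.linspace(0.0, 10.0, n)                    # the fixed constants X_1, ..., X_n
    beta0, beta1, sigma2 = 1.0, 0.5, 0.25            # illustrative parameter values
    Y = beta0 + beta1 * X + rng.normal(scale=np.sqrt(sigma2), size=n)

    Sxx = ((X - X.mean()) ** 2).sum()
    b1 = ((X - X.mean()) * (Y - Y.mean())).sum() / Sxx    # least squares slope
    b0 = Y.mean() - b1 * X.mean()                         # least squares intercept
    s2 = ((Y - b0 - b1 * X) ** 2).sum() / (n - 2)         # usual unbiased estimate of sigma^2
    print(b0, b1, s2)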
Consider a simple linear regression model with nonstochastic regressor: $Y_i = \beta_1 + \beta_2 X_i + u_i$. 1. [3 points] What are the assumptions of this model so that the OLS estimators are BLUE (best linear unbiased estimators)? 2. [4 points] Let $\hat{\beta}_1$ and $\hat{\beta}_2$ be the OLS estimators of $\beta_1$ and $\beta_2$. Derive $\hat{\beta}_1$ and $\hat{\beta}_2$. 3. [2 points] Show that $\hat{\beta}_2$ is an unbiased estimator of $\beta_2$.
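For part 2, the usual derivation proceeds from the normal equations (a sketch in the problem's notation, not necessarily the expected write-up):
\[
\min_{\beta_1,\beta_2}\sum_i (Y_i - \beta_1 - \beta_2 X_i)^2:\quad
\sum_i (Y_i - \hat{\beta}_1 - \hat{\beta}_2 X_i) = 0,\quad
\sum_i X_i (Y_i - \hat{\beta}_1 - \hat{\beta}_2 X_i) = 0,
\]
which solve to
\[
\hat{\beta}_2 = \frac{\sum_i (X_i - \bar{X})(Y_i - \bar{Y})}{\sum_i (X_i - \bar{X})^2},\qquad
\hat{\beta}_1 = \bar{Y} - \hat{\beta}_2\bar{X};
\]
unbiasedness of $\hat{\beta}_2$ (part 3) then follows by writing $\hat{\beta}_2 = \beta_2 + \sum_i w_i u_i$ with $w_i = (X_i - \bar{X})\big/\sum_j (X_j - \bar{X})^2$ and taking expectations.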
QUESTION 5. Suppose that $Y_1, Y_2, \ldots, Y_n$ are independent variables such that ..., where $\beta$ is an unknown parameter, $x_1, x_2, \ldots, x_n$ are known real numbers, and $e_1, e_2, \ldots, e_n$ are independent random errors, each with a normal distribution with mean 0 and variance $\sigma^2$. (a) Show that ... is an unbiased estimator of $\beta$. What is the variance of the estimator? (b) Given that the probability density function of $Y_i$ is ... (and 0 elsewhere), show that the maximum likelihood estimator of $\beta$ is not the...
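The model and the estimator in (a) were lost in extraction. Purely for illustration, assuming the common no-intercept form $Y_i = \beta x_i + e_i$ with $\hat{\beta} = \sum_i x_i Y_i \big/ \sum_i x_i^2$ (an assumption, not recovered from the problem), part (a) would run:
\[
E(\hat{\beta}) = \frac{\sum_i x_i\,(\beta x_i + E(e_i))}{\sum_i x_i^2} = \beta,\qquad
\operatorname{Var}(\hat{\beta}) = \frac{\sigma^2 \sum_i x_i^2}{\left(\sum_i x_i^2\right)^2} = \frac{\sigma^2}{\sum_i x_i^2}.
\]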