Question

Q1

a) Explain what it means that the ordinary least squares regression estimator is a linear estimator, paying specific attention to what linearity implies about how the independent variables interact with each other.

b) Give two examples of models where the parameters of interest cannot be directly estimated using OLS regression because of nonlinear relationships between them.

c) What is the minimum set of conditions necessary for the OLS estimator to be the most efficient unbiased estimator (BLUE) of a parameter? List each of these minimum conditions and explain what they mean in one or two sentences.

d) Choose any two of the conditions and, for each one, (i) explain what could go wrong in estimating a model should the condition not hold, and (ii) give one real-world example of a research design where the condition might not be satisfied.

Answer #1

a) In statistics, ordinary least squares (OLS) is a type of linear least squares method for estimating the unknown parameters in a linear regression model. OLS chooses the parameters of a linear function of a set of explanatory variables by the principle of least squares: minimizing the sum of the squares of the differences between the observed values of the dependent variable in the given dataset and the values predicted by the linear function. Geometrically, this is the sum of the squared distances, parallel to the axis of the dependent variable, between each data point in the set and the corresponding point on the regression surface; the smaller the differences, the better the model fits the data.

Saying that OLS is a linear estimator means that the estimate $\hat{\beta} = (X^{\top}X)^{-1}X^{\top}y$ is a linear function of the observed dependent variable $y$. Linearity in the parameters also implies that the independent variables enter the model additively: each regressor contributes its own term $\beta_j x_j$, so the effect of one variable on $y$ does not depend on the values of the other variables unless an interaction term is explicitly included as an additional regressor.
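
As a minimal sketch (using NumPy on purely illustrative simulated data), the closed form above makes the linearity explicit: the weight matrix $(X^{\top}X)^{-1}X^{\top}$ depends only on the regressors, so the estimate is a weighted sum of the observed values of $y$.

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative simulated data: intercept plus two regressors.
    n = 200
    X = np.column_stack([np.ones(n), rng.normal(size=n), rng.normal(size=n)])
    beta_true = np.array([1.0, 2.0, -0.5])
    y = X @ beta_true + rng.normal(size=n)

    # OLS closed form: beta_hat = (X'X)^{-1} X'y.
    # W depends only on X, so beta_hat = W @ y is linear in y.
    W = np.linalg.inv(X.T @ X) @ X.T
    beta_hat = W @ y
    print(beta_hat)  # close to beta_true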

In some applications, especially with cross-sectional data, an additional assumption is imposed: that all observations are independent and identically distributed (iid). This means that all observations are taken from a random sample, which makes the standard OLS assumptions simpler and easier to interpret. This framework also allows one to state asymptotic results (as the sample size n → ∞), which are understood as the theoretical possibility of drawing new independent observations from the data-generating process. The list of assumptions in this case is as follows (a small simulation sketch follows the list):

  • iid observations: $(x_i, y_i)$ is independent from, and has the same distribution as, $(x_j, y_j)$ for all $i \neq j$;
  • no perfect multicollinearity: $Q_{xx} = \mathrm{E}[x_i x_i^{\top}]$ is a positive-definite matrix;
  • exogeneity: $\mathrm{E}[\varepsilon_i \mid x_i] = 0$;
  • homoscedasticity: $\mathrm{Var}[\varepsilon_i \mid x_i] = \sigma^2$.
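
Under these assumptions the OLS estimator is unbiased. A small Monte Carlo sketch (NumPy, with hypothetical iid, exogenous, homoscedastic simulated data) illustrates this: the average of the estimates over repeated samples is close to the true parameter vector.

    import numpy as np

    rng = np.random.default_rng(1)
    beta_true = np.array([1.0, 2.0])
    n, reps = 100, 2000

    estimates = []
    for _ in range(reps):
        # iid draws, exogenous regressor, homoscedastic errors (sigma = 1).
        x = rng.normal(size=n)
        X = np.column_stack([np.ones(n), x])
        eps = rng.normal(size=n)          # E[eps | x] = 0, Var[eps | x] = 1
        y = X @ beta_true + eps
        estimates.append(np.linalg.lstsq(X, y, rcond=None)[0])

    print(np.mean(estimates, axis=0))     # approximately beta_true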

b) A nonlinear model is any model of the basic form $y = f(\vec{x}; \vec{\beta}) + \varepsilon$ in which

  1. the functional part of the model is not linear with respect to the unknown parameters $\beta_0, \beta_1, \ldots$, and
  2. the method of least squares is used to estimate the values of the unknown parameters.

Due to the way in which the unknown parameters of the function are usually estimated, however, it is often much easier to work with models that meet two additional criteria: the function is smooth with respect to the unknown parameters, and the least squares criterion that is used to obtain the parameter estimates has a unique solution.

Some examples of nonlinear models include

$f(x;\vec{\beta}) = \dfrac{\beta_0 + \beta_1 x}{1 + \beta_2 x}$

$f(x;\vec{\beta}) = \beta_1 x^{\beta_2}$

$f(x;\vec{\beta}) = \beta_0 + \beta_1 \exp(-\beta_2 x)$
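
Parameters such as $\beta_2$ in the models above cannot be recovered from a single OLS regression; they are usually estimated by iterative nonlinear least squares. A minimal sketch using scipy.optimize.curve_fit on the exponential-decay model (the data and starting values are purely illustrative assumptions):

    import numpy as np
    from scipy.optimize import curve_fit

    def model(x, b0, b1, b2):
        # Exponential-decay model: f(x; beta) = b0 + b1 * exp(-b2 * x)
        return b0 + b1 * np.exp(-b2 * x)

    rng = np.random.default_rng(2)
    x = np.linspace(0.0, 10.0, 100)
    y = model(x, 1.0, 3.0, 0.7) + rng.normal(scale=0.1, size=x.size)

    # Iterative least squares; starting values are needed because the
    # objective is nonlinear in (b0, b1, b2).
    popt, pcov = curve_fit(model, x, y, p0=[0.5, 1.0, 0.5])
    print(popt)  # approximately (1.0, 3.0, 0.7)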

c) Linear: the linearity property of the OLS estimator means that OLS belongs to the class of estimators that are linear in $y$, the dependent variable, i.e. the estimate can be written as a weighted sum of the observed values of $y$.

Unbiasedness: unbiasedness is one of the most desirable properties of any estimator. The expected value of the estimator should equal the true parameter (population) value, so that it does not systematically over- or under-estimate it.

Minimum variance:

  1. If the estimator is unbiased but does not have the least variance, it is not the best.
  2. If the estimator has the least variance but is biased, it is again not the best.
  3. If the estimator is both unbiased and has the least variance, it is the best estimator (see the simulation sketch after this list).
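
To make the efficiency point concrete, the sketch below (NumPy, simulated data chosen only for illustration) compares the OLS slope with another linear unbiased estimator, the same slope formula computed from only half of the sample; both are centred on the true slope, but OLS has the smaller sampling variance.

    import numpy as np

    rng = np.random.default_rng(3)
    beta_true = 2.0
    n, reps = 200, 2000
    ols, half = [], []

    for _ in range(reps):
        x = rng.normal(size=n)
        y = beta_true * x + rng.normal(size=n)
        # OLS slope (no intercept) using all observations.
        ols.append(np.sum(x * y) / np.sum(x * x))
        # Alternative linear unbiased estimator: same formula on half the sample.
        xh, yh = x[: n // 2], y[: n // 2]
        half.append(np.sum(xh * yh) / np.sum(xh * xh))

    print(np.mean(ols), np.var(ols))      # unbiased, smaller variance
    print(np.mean(half), np.var(half))    # unbiased, larger variance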

Consistency:

An estimator is consistent if its value approaches the actual, true parameter (population) value as the sample size increases. An estimator is consistent if it satisfies two conditions: it is asymptotically unbiased, and its variance converges to 0 as the sample size increases.
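
A quick sketch of consistency (NumPy, simulated data used purely for illustration): as the sample size grows, the OLS slope settles around the true value and its sampling variability shrinks toward zero.

    import numpy as np

    rng = np.random.default_rng(4)
    beta_true = 2.0

    for n in [50, 500, 5000, 50000]:
        x = rng.normal(size=n)
        y = beta_true * x + rng.normal(size=n)
        slope = np.sum(x * y) / np.sum(x * x)   # OLS slope through the origin
        print(n, slope)                          # approaches beta_true as n grows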
