Question

1. Which of the following conditions will lead to a smaller variance for the intercept estimator for your multiple regression model?

(A) X values cluster far from the origin of the X axis

(B) X values closely pack around the mean of X in your sample

(C) Small sample sizes

(D) High correlation among the explanatory variables

(E) Small error variance in the population regression function

2. R-squared

(A) measures the proportion of variability of the dependent variable that is explained by the explanatory variables

(B) measures the degree of linear association between the dependent and the explanatory variables

(C) measures the goodness-of-fit of a linear model

(D) all of the above

3. If X has a negative effect on Y and Z has a positive effect on Y, and X and Z are negatively correlated, what is the expected consequence of omitting Z from a regression of Y on X?

(A) The estimated coefficient on X will be biased downwards (too negative)

(B) The estimated coefficient on X will be biased upwards (insufficiently negative)

(C) The estimated coefficient on X will be biased upwards to the point of becoming positive

(D) The estimated coefficient on X (which should be positive) will be biased downwards to the point of becoming negative

4. We might consider introducing a quadratic term for some X variables in regression models

(A) if theory or intuition tells us that the effect on Y of a given change in X will not always be the same for all values of X

(B) because the quadratic term might be a good replacement for the linear term in X

(C) because R-squared only works if all the variables in the model are squared

(D) when we are certain that there are no non-negative values of X

5. If you wish to use a set of dummy variables to capture six different categories of an explanatory factor,

(A) you should use 5 dummy variables representing any 5 categories

(B) you should use 6 dummy variables representing 6 categories

(C) you should use 7 dummy variables representing the 6 categories plus a constant term

(D) you should use 4 dummy variables representing any 4 categories

6. The error term in linear regression models is assumed:

(A) to have a mean of zero

(B) to have a variance of zero

(C) to be normally distributed with a positive mean

(D) to be normally distributed with a negative mean

7. How should βₖ in the general multiple regression model be interpreted?

(A) The number of units of change in the expected value of Y for a 1-unit increase in Xₖ when all remaining variables are unchanged.

(B) The magnitude by which Xₖ varies in the model.

(C) The amount of variation in Y explained by Xₖ in the model.

(D) The number of variables used in the model.

8. What are the consequences of using least squares when heteroskedasticity is present?

(A) None of the above

(B) Confidence intervals and hypothesis testing are inaccurate due to inflated standard errors.

(C) All coefficient estimates are biased for variables correlated with the error term.

(D) No consequences, coefficient estimates are still unbiased.

9. The following equation has been used to estimate wages:

ln(Y) = β₁ + β₂·EDU + β₃·EXPER + β₄·EXPER² + e

where Y is income, EDU is years of education and EXPER is experience in the field. If you suspect that males earn higher wages than females and that the wage difference increases with education, how would you adjust the econometric model to estimate wages?

(A) Include a binary variable for MALE and an interaction term equal to MALE*EDU

(B) Include a binary variable for gender, MALE.

(C) Include an interaction term equal to MALE*EDU.

(D) Include an indicator variable for MALE and one for FEMALE.

10. You have estimated the following equation using OLS:

ŷ = 33.75 + 1.45·MALE

where y is annual income in thousands and MALE is an indicator variable equal to 1 for males and 0 for females. According to this model, what is the average income for females?

(A) $33,750

(B) $35,200

(C) $32,300

(D) Cannot be determined.

Answer #1

Question 1.

In the multiple regression model the covariance matrix of the OLS coefficient vector (including the intercept b₁) is

Var(b) = σ²·(X′X)⁻¹

where σ² is the error variance of the population regression function. In the simple regression case this reduces to Var(b₁) = σ²·ΣXᵢ² / (n·Σ(Xᵢ − X̄)²) for the intercept.

Either way, the variance of the intercept estimator is directly proportional to the error variance, so a smaller error variance in the population regression function gives a smaller variance for the intercept estimator.

So the correct option is option E.
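To see this numerically (this is only an illustration, not part of the original question; the sample size, slope and error standard deviations below are made-up numbers), here is a small Python/NumPy Monte Carlo sketch comparing the spread of the OLS intercept estimate under a large and a small error variance:

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 50, 2000
x = rng.uniform(0, 10, size=n)          # fixed regressor values
X = np.column_stack([np.ones(n), x])    # design matrix with an intercept column

for sigma in (2.0, 0.5):                # large vs. small error standard deviation
    intercepts = []
    for _ in range(reps):
        y = 1.0 + 0.8 * x + rng.normal(0, sigma, size=n)   # hypothetical true model
        b, *_ = np.linalg.lstsq(X, y, rcond=None)          # OLS fit
        intercepts.append(b[0])
    print(f"sigma = {sigma}: sd of intercept estimate = {np.std(intercepts):.3f}")
```

The empirical standard deviation of the intercept estimate should come out noticeably smaller for the smaller error variance, consistent with option E.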

Question 2.

R-squared measures the proportion of the variability of the dependent variable that is explained by the explanatory variables. It is also a measure of the goodness of fit of a linear model, and it measures the degree of linear association between the dependent and the explanatory variables.

So the correct option is option D, all of the above.
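As an illustration (with a made-up data-generating process), the sketch below computes R² as 1 − SSR/SST for a two-regressor model and checks that, with an intercept in the model, it equals the squared correlation between y and the fitted values, tying together the "proportion of variability explained", "goodness of fit" and "linear association" readings:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = 2.0 + 1.5 * x1 - 0.7 * x2 + rng.normal(size=n)   # hypothetical DGP

X = np.column_stack([np.ones(n), x1, x2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)            # OLS fit
y_hat = X @ b

ss_res = np.sum((y - y_hat) ** 2)                    # residual sum of squares
ss_tot = np.sum((y - y.mean()) ** 2)                 # total sum of squares
r2 = 1 - ss_res / ss_tot
print(r2)

# With an intercept in the model, R^2 equals the squared correlation
# between y and its fitted values:
print(np.corrcoef(y, y_hat)[0, 1] ** 2)
```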

Question 3.

X and Z are negatively correlated, X has a negative effect on Y, and Z has a positive effect on Y.

If Z is omitted from the regression of Y on X, the omitted-variable bias in the coefficient on X equals the true coefficient on Z times the slope from regressing Z on X. The coefficient on Z is positive and the slope of Z on X is negative (since X and Z are negatively correlated), so the bias is negative: the estimated coefficient on X will be biased downwards (too negative).

So the correct option is option A.
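A quick simulation makes the direction of the bias concrete (the correlation of −0.6 and the coefficients −1.0 and 2.0 are made-up values chosen only to match the signs in the question):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5000
# X and Z negatively correlated (hypothetical correlation structure)
z = rng.normal(size=n)
x = -0.6 * z + rng.normal(scale=0.8, size=n)
# X has a negative effect on Y, Z a positive one (made-up coefficients)
y = 1.0 - 1.0 * x + 2.0 * z + rng.normal(size=n)

full = np.column_stack([np.ones(n), x, z])
short = np.column_stack([np.ones(n), x])
b_full, *_ = np.linalg.lstsq(full, y, rcond=None)
b_short, *_ = np.linalg.lstsq(short, y, rcond=None)

print("coef on X, Z included:", round(b_full[1], 2))   # close to the true -1.0
print("coef on X, Z omitted:", round(b_short[1], 2))   # more negative than -1.0
```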

Question 4.

We should only introduce a quadratic term if theory or intuition tells us that the effect of X on Y is not the same for all values of X, i.e., the effect may increase or decrease as X changes.

So the correct option is option A.
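For illustration (the data-generating process below is made up), the sketch fits y on x and x² and shows that the estimated marginal effect of x, b₂ + 2·b₃·x, changes with the value of x:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200
x = rng.uniform(0, 10, size=n)
# hypothetical DGP in which the effect of x on y weakens as x grows
y = 5.0 + 3.0 * x - 0.2 * x**2 + rng.normal(size=n)

X = np.column_stack([np.ones(n), x, x**2])      # include the quadratic term
b, *_ = np.linalg.lstsq(X, y, rcond=None)
b1, b2, b3 = b

# The estimated marginal effect of x on y is b2 + 2*b3*x, which varies with x:
for val in (1.0, 5.0, 9.0):
    print(f"estimated marginal effect at x = {val}: {b2 + 2 * b3 * val:.2f}")
```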

Question 5.

Whenever there are n categories and we want to incorporate dummy variables, we should include only (n − 1) dummy variables to avoid the dummy variable trap (perfect collinearity with the constant term).

Here we have 6 categories, so we should use only 5 dummy variables representing any 5 categories; the omitted category serves as the reference group.

So the correct option is option A.
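As a practical illustration (the variable name region and its categories below are hypothetical), pandas can build the 5 dummies for 6 categories automatically with drop_first=True:

```python
import pandas as pd

# Hypothetical categorical factor with six categories
df = pd.DataFrame({"region": ["N", "S", "E", "W", "NE", "SW", "N", "E"]})

# drop_first=True keeps 5 of the 6 categories; the dropped category is the
# reference group absorbed by the regression constant.
dummies = pd.get_dummies(df["region"], prefix="region", drop_first=True)
print(dummies.shape[1])   # 5 dummy columns for 6 categories
print(dummies.head())
```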

Under the HomeworkLib guidelines we are only supposed to answer at most one question at a time, but I have answered five; I hope you appreciate it. For the remaining questions, please repost them and we will be more than happy to answer them.

Thank you
