Bayesian regression

Consider the Bayesian linear regression model with K regressors, where ...

(v) Now suppose that we have an uninformative prior such that ... Show that the posterior verifies

π(β | y) ∝ exp( −(β − β̂)'X'X(β − β̂) / (2σ²) ),

where V[β̂] = σ²(X'X)⁻¹.

(vi) Now suppose that there is only one regressor xᵢ (i.e. K = 1). Show that β | y ~ N(β̂, σ² / Σᵢ xᵢ²).

(vii) Comment on how the result in part (vi) relates to the choice of prior and to standard frequentist (i.e. non-Bayesian) estimators.
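Under the flat prior in part (v) with σ² known, the posterior kernel above is that of N(β̂, σ²(X'X)⁻¹), so the posterior mean coincides with the OLS estimator. A minimal numerical sketch on simulated data (all names and values here are illustrative, not part of the problem):

```python
import numpy as np

rng = np.random.default_rng(0)
N, K = 200, 3
sigma2 = 2.0                        # known error variance
X = rng.normal(size=(N, K))
beta_true = np.array([1.0, -0.5, 2.0])
y = X @ beta_true + rng.normal(scale=np.sqrt(sigma2), size=N)

XtX_inv = np.linalg.inv(X.T @ X)
beta_hat = XtX_inv @ X.T @ y        # posterior mean = OLS estimate under a flat prior
post_var = sigma2 * XtX_inv         # posterior variance V[beta|y] = sigma^2 (X'X)^{-1}

print(beta_hat)
print(np.sqrt(np.diag(post_var)))   # posterior standard deviations
```

With a flat prior the Bayesian point estimate and its posterior spread reproduce the frequentist OLS estimate and its sampling variance, which is the comparison part (vii) asks about.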
Bayesian updating

Suppose that we have the model y | μ ~ N(μ, τ⁻¹), where τ > 0 is known and μ is an unknown parameter.

(vi) Suppose that τ = a = b = 1, and that you observe a realization of y with a value of 1. Compute the posterior distribution π(μ | 1) and explain how it relates to the prior π(μ).

(vii) Suppose now that you observe a second realization of y with a value of −1. Update the posterior π(μ | 1) to incorporate this new information.
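The two updates in parts (vi)–(vii) can be sketched with the standard normal-normal conjugate rule (precisions add; the posterior mean is a precision-weighted average). The values τ = a = b = 1 below are illustrative assumptions, not fixed by the problem text:

```python
# Sequential conjugate updating for a N(a, 1/b) prior on mu and
# observations y | mu ~ N(mu, 1/tau). Values are illustrative.
tau, a, b = 1.0, 1.0, 1.0  # likelihood precision, prior mean, prior precision

def update(prior_mean, prior_prec, y, tau):
    """One normal-normal update: precisions add, and the posterior mean
    is the precision-weighted average of the prior mean and the data."""
    post_prec = prior_prec + tau
    post_mean = (prior_prec * prior_mean + tau * y) / post_prec
    return post_mean, post_prec

m1, p1 = update(a, b, 1.0, tau)     # part (vi): after observing y = 1
m2, p2 = update(m1, p1, -1.0, tau)  # part (vii): after also observing y = -1
print(m1, p1)  # 1.0 2.0
print(m2, p2)  # 0.3333333333333333 3.0
```

The sequential result matches the batch update with precision b + 2τ and mean (ab + τ(y₁ + y₂))/(b + 2τ): since the two observations sum to zero, the data pull the posterior mean from the prior value toward the sample mean of 0.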
Bayesian updating

Suppose that we have the model y | μ ~ N(μ, τ⁻¹), where τ > 0 is known and μ is an unknown parameter.

(iii) Suppose that we have a prior μ ~ N(a, b⁻¹), where b > 0. Show that the prior distribution π(μ) verifies π(μ) ∝ exp( −b(μ − a)²/2 ).

(iv) Show that the posterior π(μ | y) verifies π(μ | y) ∝ exp( −(τ + b)(μ − (τy + ab)/(τ + b))²/2 ).

(v) Which distribution is π(μ | y)?
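The conjugacy claim in parts (iii)–(v), that the posterior is N((τy + ab)/(τ + b), (τ + b)⁻¹), can be checked numerically by normalizing the product of likelihood and prior on a grid. The parameter values below are arbitrary illustrations:

```python
import numpy as np

# Check: prior mu ~ N(a, 1/b), likelihood y | mu ~ N(mu, 1/tau)
# => posterior mu | y ~ N((tau*y + a*b)/(tau + b), 1/(tau + b)).
tau, a, b, y = 2.0, 0.5, 1.5, 1.2

def npdf(x, mean, sd):
    """Normal density, written out to keep the check numpy-only."""
    return np.exp(-0.5 * ((x - mean) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))

grid = np.linspace(-5, 5, 20001)
dx = grid[1] - grid[0]

# Unnormalized posterior: likelihood times prior, normalized on the grid
unnorm = npdf(y, grid, tau ** -0.5) * npdf(grid, a, b ** -0.5)
numeric = unnorm / (unnorm.sum() * dx)

analytic = npdf(grid, (tau * y + a * b) / (tau + b), (tau + b) ** -0.5)
print(np.max(np.abs(numeric - analytic)))  # should be near zero
```

The grid-normalized posterior and the analytic normal density agree to numerical precision, which answers part (v): π(μ | y) is again a normal distribution.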
1. Consider a linear regression model of y on K regressors and an intercept.

(i) Describe the Breusch-Pagan test of heteroskedasticity.
(ii) What are the consequences for OLS estimation and testing of rejecting the null hypothesis of the BP test?
(iii) What can you say about the form of the heteroskedasticity function implied by the BP test? What if it is wrong?
(iv) Describe the test of heteroskedasticity proposed by White.
(v) When there is only one regressor (K = 1), give the expression for White's...
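For part (i): the Breusch-Pagan test regresses the squared OLS residuals on the regressors and uses the LM statistic n·R² from that auxiliary regression, which is asymptotically χ²_K under homoskedasticity. A numpy-only sketch on simulated heteroskedastic data (variable names and the data-generating process are illustrative):

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(1)
n = 500
x = rng.uniform(1, 3, size=n)
u = rng.normal(scale=x, size=n)       # Var(u|x) grows with x: heteroskedastic
y = 2.0 + 1.0 * x + u

X = np.column_stack([np.ones(n), x])  # regressors including intercept

def r_squared(Z, v):
    """R^2 from an OLS regression of v on Z (Z includes a constant)."""
    coef, *_ = np.linalg.lstsq(Z, v, rcond=None)
    resid = v - Z @ coef
    return 1 - (resid @ resid) / ((v - v.mean()) @ (v - v.mean()))

# Step 1: OLS residuals from the original regression
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
uhat = y - X @ beta

# Step 2: regress squared residuals on the regressors; LM = n * R^2
R2 = r_squared(X, uhat ** 2)
LM = n * R2
pval = chi2.sf(LM, df=1)              # one non-constant regressor here
print(LM, pval)
```

White's test (part iv) has the same structure but adds squares and cross-products of the regressors to the auxiliary regression, so it does not impose a linear form on the heteroskedasticity function, which is the issue raised in part (iii).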
Consider the following simple regression model: yᵢ = β₀ + β₁xᵢ + uᵢ.

a. Suppose that OLS assumptions 1 to 4 hold. We know that the homoskedasticity assumption is stated as Var[uᵢ | xᵢ] = σ² for all i. Now suppose that homoskedasticity does not hold; mathematically, this is expressed as Var[uᵢ | xᵢ] = σᵢ². In other words, the subscript i in σᵢ² means that the conditional variance of the errors is different for each individual i. Under heteroskedasticity, we can derive the expression for the variance of β̂₁ as

Var(β̂₁) = Σᵢ (xᵢ − x̄)² σᵢ² / SSTₓ²,

where SSTₓ = Σᵢ (xᵢ − x̄)² is the total sum of squares of the xᵢ.
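The variance formula above can be estimated by replacing each σᵢ² with the squared residual ûᵢ² (White's heteroskedasticity-consistent estimator). The sketch below, on illustrative simulated data with error variance growing in x, compares it with the usual homoskedastic formula σ̂²/SSTₓ:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5000
x = rng.uniform(0, 4, size=n)
u = rng.normal(scale=np.exp(x / 2), size=n)  # error sd grows with x
y = 1.0 + 2.0 * x + u

xbar = x.mean()
sst_x = np.sum((x - xbar) ** 2)
b1 = np.sum((x - xbar) * (y - y.mean())) / sst_x  # OLS slope
b0 = y.mean() - b1 * xbar
uhat = y - b0 - b1 * x

# Conventional formula sigma^2 / SST_x, valid only under homoskedasticity
var_homo = np.sum(uhat ** 2) / (n - 2) / sst_x
# Sample analogue of the formula in the text, with sigma_i^2
# replaced by the squared residuals (White/HC0)
var_hc0 = np.sum((x - xbar) ** 2 * uhat ** 2) / sst_x ** 2
print(np.sqrt(var_homo), np.sqrt(var_hc0))
```

Because the error variance here is largest where (xᵢ − x̄)² is large, the robust standard error exceeds the conventional one, illustrating why the homoskedastic formula is invalid in this setting.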
2. Consider the simple linear regression model yᵢ = β₀ + β₁xᵢ + eᵢ, where e₁, ..., eₙ are i.i.d. N(0, σ²), for i = 1, 2, ..., n. Suppose that we would like to estimate the mean response at x = x*, that is, we want to estimate μ_{y|x=x*} = β₀ + β₁x*. The least squares estimator for μ_{y|x=x*} is μ̂ = b₀ + b₁x*, where b₀ and b₁ are the least squares estimators for β₀ and β₁.

(a) Show that the least squares estimator for...
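A Monte Carlo sketch can illustrate what the estimator μ̂ = b₀ + b₁x* does: across repeated samples its average equals β₀ + β₁x* (unbiasedness), and its variance matches the standard formula σ²(1/n + (x* − x̄)²/SSTₓ). All names and values below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)
b0_true, b1_true, sigma = 1.0, 2.0, 1.0
x = np.linspace(0, 1, 20)                # fixed design points
xstar = 0.7
target = b0_true + b1_true * xstar       # mu_{y|x=x*}

est = []
for _ in range(2000):
    y = b0_true + b1_true * x + rng.normal(scale=sigma, size=x.size)
    b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
    b0 = y.mean() - b1 * x.mean()
    est.append(b0 + b1 * xstar)          # plug-in estimator of the mean response

est = np.array(est)
# Standard variance formula: sigma^2 * (1/n + (x* - xbar)^2 / SST_x)
var_theory = sigma ** 2 * (1 / x.size
                           + (xstar - x.mean()) ** 2 / np.sum((x - x.mean()) ** 2))
print(est.mean(), target)
print(est.var(), var_theory)
```

The simulated mean of μ̂ is close to β₀ + β₁x* and its simulated variance is close to the theoretical expression, which is the content of part (a) and its usual follow-ups.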
4. Consider the regression model yᵢ = β₁ + β₂x₂ᵢ + ... + β_K x_Kᵢ + eᵢ, where the errors may be heteroskedastic. Choose the most incorrect statement:

(a) The OLS estimators are consistent and unbiased.
(b) We should report the OLS estimates with the robust standard errors.
(c) The Gauss-Markov theorem may not apply.
(d) GLS cannot be used because we do not know the error variances in practice.
(e) We should take care of heteroskedasticity only if homoskedasticity is rejected.

Consider the regression model yₜ = β₁ + ... + β_K x_Kₜ + eₜ, where eₜ = ρe_{t−1} + ...
Exercise 1

Answer the following questions:

a. Consider the multiple regression model y = Xβ + ε subject to a set of linear constraints of the form Cβ = γ, where C is an m × (k + 1) matrix. The Gauss-Markov conditions hold and also ε ~ N(0, σ²I). Is it true that we can test the hypothesis Cβ = γ using the statistic

F = [ (SSR_reduced − SSR_full) / m ] / [ SSR_full / (n − k − 1) ]?

Please explain.

b. Refer to question (a). Let H and H₁ be the hat matrices of the full and reduced model respectively. Show...
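The F statistic in part (a) compares the residual sums of squares of the restricted and unrestricted fits. A sketch for the nested special case where the restriction Cβ = γ is β₂ = 0 (i.e. C = [0, 0, 1], γ = 0, m = 1); the data and names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(4)
n, k = 100, 2                            # two slopes plus an intercept
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])
beta = np.array([1.0, 0.5, 0.0])         # the restriction beta_2 = 0 holds
y = X @ beta + rng.normal(size=n)

def ssr(Z, y):
    """Residual sum of squares from OLS of y on Z."""
    coef, *_ = np.linalg.lstsq(Z, y, rcond=None)
    r = y - Z @ coef
    return r @ r

m = 1                                    # one linear restriction
ssr_full = ssr(X, y)
ssr_reduced = ssr(X[:, :2], y)           # reduced model imposes beta_2 = 0
F = ((ssr_reduced - ssr_full) / m) / (ssr_full / (n - k - 1))
print(F)
```

Under the null, with normal errors, F follows an F(m, n − k − 1) distribution; since the reduced model is nested in the full one, SSR_reduced ≥ SSR_full and the statistic is nonnegative by construction.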