Question

You should submit your report in ONE file (in PDF or MS Word format). Your report must contain the original R code, with outputs and a clear statement of the final answer. Partial points will be awarded only when you have the correct R code. Font size should be 10-12 pt, and plots (if any) should be clearly labeled.

Question 3 (extra credit, 5 points): In the simple linear regression introduced in class, the vertical distance is used (left panel of the figure above). Alternatively, as we discussed, you can consider the perpendicular distance (right panel). Please derive $\beta_0$ and $\beta_1$ for the same problem of minimizing the sum of squared distances, but using this perpendicular distance.

Answer #1

Define the squared perpendicular distance of the observed data $(x_i, y_i)$, $i = 1, 2, \ldots, n$, from the line by
$$d_i^2 = (x_i - X_i)^2 + (y_i - Y_i)^2,$$
where $(X_i, Y_i)$ denotes the $i$th pair of observations without any error, i.e. the foot of the perpendicular from $(x_i, y_i)$ onto the line, so that $Y_i = \beta_0 + \beta_1 X_i$.

Now minimize $\sum_{i=1}^n d_i^2$ under the constraints $E_i \equiv Y_i - \beta_0 - \beta_1 X_i = 0$ using the Lagrange multiplier method. Define
$$\Delta = \sum_{i=1}^n d_i^2 - 2\sum_{i=1}^n \lambda_i E_i,$$
where $\lambda_1, \lambda_2, \ldots, \lambda_n$ are Lagrange multipliers. Setting the partial derivatives of $\Delta$ to zero gives
$$\frac{\partial \Delta}{\partial X_i} = 0 \Rightarrow X_i = x_i - \lambda_i \beta_1 \quad (1), \qquad
\frac{\partial \Delta}{\partial Y_i} = 0 \Rightarrow Y_i = y_i + \lambda_i \quad (2),$$
$$\frac{\partial \Delta}{\partial \beta_0} = 0 \Rightarrow \sum_{i=1}^n \lambda_i = 0 \quad (3), \qquad
\frac{\partial \Delta}{\partial \beta_1} = 0 \Rightarrow \sum_{i=1}^n \lambda_i X_i = 0 \quad (4).$$

Now, from (1) and (2),
$$E_i = y_i + \lambda_i - \beta_0 - \beta_1(x_i - \lambda_i\beta_1) = 0
\;\Rightarrow\; \lambda_i = \frac{\beta_0 + \beta_1 x_i - y_i}{1+\beta_1^2}.$$

From (3), $\sum_{i=1}^n \lambda_i = 0 \Rightarrow \sum_{i=1}^n (\beta_0 + \beta_1 x_i - y_i) = 0$, so
$$\hat\beta_0 = \bar y - \hat\beta_1 \bar x,
\qquad \text{where } \bar x = \tfrac{1}{n}\textstyle\sum_{i=1}^n x_i,\; \bar y = \tfrac{1}{n}\textstyle\sum_{i=1}^n y_i.$$

From (4), $\sum_{i=1}^n \lambda_i X_i = 0 \Rightarrow \sum_{i=1}^n (\beta_0 + \beta_1 x_i - y_i)X_i = 0$, i.e.
$$\sum_{i=1}^n (\beta_0 + \beta_1 x_i - y_i)(x_i - \lambda_i\beta_1) = 0.$$
Substituting $\beta_0 = \bar y - \beta_1\bar x$, so that $\beta_0 + \beta_1 x_i - y_i = \beta_1(x_i-\bar x) - (y_i-\bar y)$, and multiplying through by $-(1+\beta_1^2)$,
$$\sum_{i=1}^n \big((y_i-\bar y)-\beta_1(x_i-\bar x)\big)\Big((1+\beta_1^2)x_i - \big(\beta_1(x_i-\bar x)-(y_i-\bar y)\big)\beta_1\Big) = 0,$$
or
$$(1+\beta_1^2)\sum_{i=1}^n x_i\big((y_i-\bar y)-\beta_1(x_i-\bar x)\big)
+ \beta_1\sum_{i=1}^n \big(\beta_1(x_i-\bar x)-(y_i-\bar y)\big)^2 = 0.$$
Take $u_i = x_i-\bar x$ and $v_i = y_i-\bar y$; then, since $\sum_{i=1}^n u_i = \sum_{i=1}^n v_i = 0$,
$$\sum_{i=1}^n \big(\beta_1^2 u_i v_i + \beta_1(u_i^2 - v_i^2) - u_i v_i\big) = 0,$$
or, writing $s_{xx} = \sum_{i=1}^n u_i^2$, $s_{yy} = \sum_{i=1}^n v_i^2$ and $s_{xy} = \sum_{i=1}^n u_i v_i$,
$$\beta_1^2 s_{xy} + \beta_1(s_{xx} - s_{yy}) - s_{xy} = 0.$$
This quadratic has two real roots whose product is $-1$, so they correspond to two perpendicular lines; the one that minimizes the sum of squared perpendicular distances is the root taken with the positive square root,
$$\hat\beta_1 = \frac{s_{yy} - s_{xx} + \sqrt{(s_{xx}-s_{yy})^2 + 4 s_{xy}^2}}{2 s_{xy}} \qquad (s_{xy}\neq 0),$$
and then $\hat\beta_0 = \bar y - \hat\beta_1\bar x$.
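As a quick sanity check on the choice of root (a minimal numerical sketch, not part of the derivation), note that the perpendicular distance from $(x_i, y_i)$ to the line $y = \beta_0 + \beta_1 x$ is $|y_i - \beta_0 - \beta_1 x_i|/\sqrt{1+\beta_1^2}$, so with $\beta_0 = \bar y - \beta_1\bar x$ the objective reduces to $Q(\beta_1) = \sum_i \big((y_i-\bar y)-\beta_1(x_i-\bar x)\big)^2/(1+\beta_1^2)$. The sketch below reuses the data from the R code further down and evaluates $Q$ at both roots of the quadratic; the search interval passed to optimize() is an arbitrary illustrative choice.

# Sketch: check which root of s_xy*b^2 + (s_xx - s_yy)*b - s_xy = 0
# minimizes the perpendicular sum of squares Q(b)
x=c(11,12,13,14,15,16,17,18,19,20)
y=c(30,29,29,25,24,24,24,21,18,15)
u=x-mean(x); v=y-mean(y) # centered data
s_xx=sum(u^2); s_yy=sum(v^2); s_xy=sum(u*v)
Q=function(b) sum((v-b*u)^2)/(1+b^2) # perpendicular SSE with the intercept profiled out
roots=((s_yy-s_xx)+c(1,-1)*sqrt((s_xx-s_yy)^2+4*s_xy^2))/(2*s_xy)
sapply(roots,Q) # the first root (positive square root) gives the smaller value
optimize(Q,interval=c(-10,10))$minimum # numerical minimizer; should agree with the first root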

R code:

x=c(11,12,13,14,15,16,17,18,19,20) # values of independent variable
y=c(30,29,29,25,24,24,24,21,18,15) # values of dependent variable
n=length(x) # number of paired observations
s_xx=sum(x^2)-(sum(x))^2/n # corrected sum of squares of x
s_yy=sum(y^2)-(sum(y))^2/n # corrected sum of squares of y
s_xy=sum(x*y)-(sum(x)*sum(y))/n # corrected sum of cross-products of x and y
bar_x=mean(x) # sample mean of x
bar_y=mean(y) # sample mean of y
# slope: the root of s_xy*b^2 + (s_xx - s_yy)*b - s_xy = 0 that minimizes the
# sum of squared perpendicular distances (take the positive square root)
beta1=(s_yy-s_xx+sqrt((s_xx-s_yy)^2+4*s_xy^2))/(2*s_xy)
beta0=bar_y-beta1*bar_x # intercept: the fitted line passes through (bar_x, bar_y)

beta0 # estimated intercept of the perpendicular-distance fit
beta1 # estimated slope of the perpendicular-distance fit
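The slope can also be cross-checked via principal components, since the perpendicular-distance (orthogonal) fit is the line through $(\bar x, \bar y)$ along the first principal axis of the data. A minimal sketch, assuming only base R (cov() and eigen()) and the x, y vectors defined above; the object names are illustrative:

# Cross-check: slope of the first principal axis of (x, y)
S=cov(cbind(x,y)) # 2 x 2 sample covariance matrix
e1=eigen(S)$vectors[,1] # eigenvector of the largest eigenvalue
beta1_pca=e1[2]/e1[1] # slope of the first principal axis
beta0_pca=mean(y)-beta1_pca*mean(x) # line passes through the centroid
c(beta0_pca,beta1_pca) # should match beta0 and beta1 above (about 49.2 and -1.63)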
