a.) Linear Regression: fit ŷ = w0 + w1·x.
[Data table: the xi, yi values; their means are x̄ = 36 and ȳ = 81.3.]
Slope estimate: w1 = Σ(xi − x̄)(yi − ȳ) / Σ(xi − x̄)², where the numerator is the sum of the terms (xi − 36)(yi − 81.3) over all data points.
Σ(xi − x̄)(yi − ȳ) = 20842.7 and Σ(xi − x̄)² = 9231, so
w1 = Σ(xi − x̄)(yi − ȳ) / Σ(xi − x̄)² = 20842.7 / 9231 ≈ 2.25
w0 = ȳ − w1·x̄ = 81.3 − (2.25)(36) = 81.3 − 81 = 0.3
ŷ = 0.3 + 2.25x
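The slope and intercept arithmetic can be checked with a short sketch. Only the summary values written in the solution are used (the raw data table is taken as given through its sums and means):

```python
# Summary statistics as written in the solution (raw data not reproduced here).
x_bar = 36.0     # mean of x
y_bar = 81.3     # mean of y
s_xy = 20842.7   # sum of (x_i - x_bar) * (y_i - y_bar)
s_xx = 9231.0    # sum of (x_i - x_bar) ** 2

w1_exact = s_xy / s_xx   # least-squares slope, about 2.258
w1 = 2.25                # the handwritten solution rounds the slope to 2.25
w0 = y_bar - w1 * x_bar  # intercept: 81.3 - 2.25 * 36 = 0.3

print(f"exact slope ~ {w1_exact:.4f}, rounded w1 = {w1}, intercept w0 = {w0:.2f}")
```

With the rounded slope, the fitted line is ŷ = 0.3 + 2.25x, matching the answer to part (a).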
b.) Gradient descent learning algorithm: x_{k+1} = x_k + α·d_k.
Search direction: in this problem, the negative gradient of f(x), i.e. d_k = −f'(x_k).
α = the learning parameter; for this problem we set α = 0.5.
x_k = the current value; we guess the initial value x0 = 3.5.

Step 1: y_x0 = 0.3 + (2.25)(3.5) = 8.175, with y' = 2.25
x1 = x0 − α·y' = 3.5 − (0.5)(2.25) = 3.5 − 1.125 = 2.375
y_x1 = 0.3 + 5.34 = 5.64
y_x0 − y_x1 = 8.175 − 5.64 = 2.53. The difference is still large, so we have not converged; go to the next step.

Step 2: (y_x1 = 5.64) x1 = 2.375, α = 0.5, y' = 2.25
x2 = x1 − α·y' = 2.375 − (0.5)(2.25) = 1.25
y_x2 = 0.3 + 2.8125 = 3.11
y_x1 − y_x2 = 5.64 − 3.11 = 2.53, still large, so we continue.
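Steps 1 and 2 above follow the same pattern, which can be sketched as a single update function (assuming, as in the solution, the constant derivative y' = 2.25 of the fitted line and α = 0.5):

```python
ALPHA = 0.5   # learning rate chosen in the solution
GRAD = 2.25   # derivative of y = 0.3 + 2.25x (constant, since y is linear)

def y(x):
    """Fitted regression line from part (a)."""
    return 0.3 + 2.25 * x

def step(x):
    """One gradient-descent update: x_{k+1} = x_k - alpha * y'(x_k)."""
    return x - ALPHA * GRAD

x0 = 3.5
x1 = step(x0)         # 3.5 - 1.125 = 2.375
decrease = y(x0) - y(x1)  # drop in y after one step, about 2.53
print(x1, decrease)
```

Because the line is linear, each step moves x by the same amount (1.125) and lowers y by the same amount (about 2.53), which is why the per-step differences in the handwritten work are all roughly equal.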
Step 3: y_x2 = 3.11
x3 = x2 − α·y' = 1.25 − (0.5)(2.25) = 1.25 − 1.125 = 0.125
y_x3 = 0.3 + 0.28 = 0.58
Step 4: y_x3 = 0.581, x3 = 0.125
x4 = x3 − α·y' = 0.125 − 0.125, x4 = 0, and y_x4 = 0.3
y_0 = 0.3, so we converge at x = 0 as the minimum.

a.) ŷ = 0.3 + 2.25x
b.) Gradient descent stops at the point x = 0.
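The steps above can be replayed in a loop. This sketch reproduces the iterates x0 through x3 of Steps 1–3 (again assuming the constant gradient 2.25 and α = 0.5 used in the solution; the sketch simply runs a fixed number of updates):

```python
ALPHA = 0.5   # learning rate from the solution
GRAD = 2.25   # constant derivative of y = 0.3 + 2.25x

def descend(x0, n_steps):
    """Run n_steps of x_{k+1} = x_k - ALPHA * GRAD, recording every iterate."""
    xs = [x0]
    for _ in range(n_steps):
        xs.append(xs[-1] - ALPHA * GRAD)
    return xs

iterates = descend(3.5, 3)
print(iterates)  # [3.5, 2.375, 1.25, 0.125]
```

Each update subtracts the same quantity α·y' = 1.125, producing the sequence 3.5, 2.375, 1.25, 0.125 seen in Steps 1–3.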