Ans. (a) By regularizing w2 with a very large C, the penalty drives w2 toward zero, so the decision boundary w0 + w1·x1 + w2·x2 = 0 tends toward the vertical line x1 = -w0/w1. Since the two classes cannot be fully separated by a vertical line, some points of both classes end up on the same side of the boundary, and the training error increases.
(b) Similarly, regularizing w1 with a very large C drives w1 toward zero, so the decision boundary tends toward the horizontal line x2 = -w0/w2. Again, points of both classes fall on the same side of the line, and the training error increases.
(c) Regularizing w0 with a very large C drives only the intercept toward zero, leaving the slope of the boundary free. The line remains inclined (it merely passes near the origin), and since the two classes can still be separated by such a line, the training error stays the same (zero).
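To make part (a) concrete, here is a minimal NumPy sketch (not part of the original question; the data, learning rate, and penalty strength `C2` are my own illustrative choices). It fits logistic regression by gradient ascent on the penalized conditional log-likelihood, putting a large quadratic penalty on w2 only, and you can check that the learned |w2| collapses relative to |w1|, i.e. the boundary becomes nearly vertical:

```python
import numpy as np

# Toy linearly separable 2-D data (illustrative, not from the question).
rng = np.random.default_rng(0)
n = 100
X = rng.normal(size=(n, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)  # separable by the line x1 + x2 = 0

def fit(C2, steps=2000, lr=0.1):
    """Maximize sum_i log P(y_i | x_i; w) - C2 * w2**2 by gradient ascent."""
    w = np.zeros(3)                          # [w0, w1, w2]
    Xb = np.hstack([np.ones((n, 1)), X])     # prepend intercept column
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))    # predicted P(y=1 | x)
        grad = Xb.T @ (y - p)                # gradient of the log-likelihood
        grad[2] -= 2.0 * C2 * w[2]           # penalty term acts on w2 only
        w += lr * grad / n
    return w

w = fit(C2=100.0)
# With a heavy penalty on w2, |w2| << |w1|: the boundary
# w0 + w1*x1 + w2*x2 = 0 is close to the vertical line x1 = -w0/w1.
```

Penalizing w1 instead (moving the penalty to `grad[1]`) produces the mirror-image effect of part (b): a nearly horizontal boundary.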
I hope this solves your problem. If you have any doubts, please ask in the comments section, and if you liked the solution, please upvote. Thanks.