The perceptron algorithm given in Eqs. through can be expressed in a more concise form by multiplying the patterns of class ω2 by −1, in which case the correction steps become

    w(k + 1) = w(k)            if w^T(k) y(k) > 0
    w(k + 1) = w(k) + c y(k)   otherwise.

This is one of several perceptron algorithm formulations that can be derived by starting from the general gradient descent equation

    w(k + 1) = w(k) − c [∂J(w, y)/∂w]_{w = w(k)}    (12.2-34)

where c > 0, J(w, y) is a criterion function, and the partial derivative is evaluated at w = w(k). Show that the perceptron algorithm formulation is obtainable from this general gradient descent procedure by using the criterion function

    J(w, y) = (1/2) (|w^T y| − w^T y)

where |arg| is the absolute value of the argument.

(Note: The partial derivative of w^T y with respect to w equals y.)
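The equivalence asked for can be checked numerically: for J(w, y) = (1/2)(|w^T y| − w^T y), the gradient is 0 when w^T y > 0 and −y when w^T y < 0, so the gradient descent step reproduces the perceptron correction rule. Below is a minimal sketch; the two sample patterns and the function name `perceptron_step` are illustrative, not from the text.

```python
import numpy as np

def perceptron_step(w, y, c=1.0):
    """One gradient descent step on J(w, y) = 0.5 * (|w^T y| - w^T y).

    grad J = 0.5 * (sign(w^T y) * y - y), which is:
      0   when w^T y > 0  (pattern correctly classified, w unchanged)
      -y  when w^T y < 0  (misclassified, so the step adds c * y)
    At w^T y = 0 the criterion is not differentiable; np.sign(0) = 0
    gives the subgradient -y/2, which still moves w toward c * y.
    """
    wty = w @ y
    grad = 0.5 * (np.sign(wty) * y - y)
    return w - c * grad

# Illustrative patterns: class omega_2 patterns are assumed to have
# already been multiplied by -1, so training seeks w^T y > 0 for all y.
patterns = [np.array([1.0, 1.0, 1.0]), np.array([-2.0, -0.5, -1.0])]
w = np.zeros(3)
for _ in range(10):          # a few passes over the (separable) set
    for y in patterns:
        w = perceptron_step(w, y)
```

After convergence, w^T y > 0 holds for every augmented pattern, and a step on a correctly classified pattern leaves w unchanged, matching the two cases of the correction rule above.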