Problem

Prove that the perceptron training algorithm given in Eqs. (12.2-34) through (12.2-36) converges in a finite number of steps if the training pattern sets are linearly separable. [Hint: Multiply the patterns of class ω2 by −1 and consider a nonnegative threshold, T, so that the perceptron training algorithm (with c = 1) is expressed as w(k + 1) = w(k) if w^T(k)y(k) > T, and w(k + 1) = w(k) + y(k) otherwise. You may need to use the Cauchy-Schwarz inequality: ||a||^2 ||b||^2 ≥ (a^T b)^2.]

w(k + 1) = w(k) + c y(k)    if y(k) ∈ ω1 and w^T(k)y(k) ≤ 0    (12.2-34)

w(k + 1) = w(k) − c y(k)    if y(k) ∈ ω2 and w^T(k)y(k) ≥ 0    (12.2-35)

w(k + 1) = w(k)    otherwise    (12.2-36)
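
As a concrete illustration of the training rule in the form given in the hint (class ω2 patterns multiplied by −1, increment c = 1, nonnegative threshold T), here is a minimal Python sketch; the function name, the stopping policy, and the example patterns below are illustrative assumptions, not part of the problem.

```python
import numpy as np

def train_perceptron(patterns, T=0.0, max_epochs=1000):
    """Perceptron training in the hint's form: every pattern y is in
    augmented form, the class omega_2 patterns have already been
    multiplied by -1, and the correction increment is c = 1."""
    patterns = np.asarray(patterns, dtype=float)
    w = np.zeros(patterns.shape[1])       # start from w(1) = 0
    corrections = 0
    for _ in range(max_epochs):
        updated = False
        for y in patterns:
            if w @ y > T:                 # correctly classified:
                continue                  # w(k+1) = w(k)
            w = w + y                     # otherwise w(k+1) = w(k) + y(k)
            corrections += 1
            updated = True
        if not updated:                   # one full error-free pass means
            return w, corrections         # the algorithm has converged
    raise RuntimeError("did not converge; data may not be separable")
```

On a small separable example (patterns (0, 0) and (0, 1) in ω1, (1, 0) and (1, 1) in ω2, in augmented form), the sketch stops after five corrections with w = (−2, 0, 1), i.e. the decision function d(x) = −2x1 + 1:

```python
# Illustrative patterns in augmented form (x1, x2, 1); the omega_2
# patterns are listed already multiplied by -1, per the hint.
omega1 = [(0, 0, 1), (0, 1, 1)]
omega2_neg = [(-1, 0, -1), (-1, -1, -1)]
w, k = train_perceptron(omega1 + omega2_neg, T=0.0)
print(w, k)   # -> [-2.  0.  1.] 5
```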

Step-by-Step Solution
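
A minimal sketch of the standard Novikoff-style convergence argument that the hint points toward; it assumes w(1) = 0 and introduces the solution vector w* and the constants a and b, none of which appear in the problem statement itself.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}

Separability gives a solution vector $w^*$ with $w^{*T} y > T$ for every
(sign-corrected) training pattern $y$. A correction occurs only when
$w^T(k)\,y(k) \le T$, in which case $w(k+1) = w(k) + y(k)$. Starting from
$w(1) = 0$, after $k$ corrections
\[
  w(k+1) = y(1) + y(2) + \dots + y(k).
\]
With $a = \min_i w^{*T} y(i) > T \ge 0$, the projection onto $w^*$ grows
at least linearly in $k$:
\[
  w^{*T} w(k+1) \ge k a ,
\]
so the Cauchy-Schwarz inequality
$\lVert w^* \rVert^2 \lVert w(k+1) \rVert^2 \ge \big(w^{*T} w(k+1)\big)^2$
gives the quadratic lower bound
\[
  \lVert w(k+1) \rVert^2 \ge \frac{k^2 a^2}{\lVert w^* \rVert^2}.
\]
On the other hand, each correction satisfies $w^T(j)\,y(j) \le T$, so with
$b = \max_i \lVert y(i) \rVert^2$,
\[
  \lVert w(j+1) \rVert^2
    = \lVert w(j) \rVert^2 + 2\,w^T(j)\,y(j) + \lVert y(j) \rVert^2
    \le \lVert w(j) \rVert^2 + 2T + b ,
\]
hence $\lVert w(k+1) \rVert^2 \le k\,(2T + b)$. The quadratic lower bound
and the linear upper bound are compatible only while
\[
  k \le \frac{(2T + b)\,\lVert w^* \rVert^2}{a^2},
\]
so the number of corrections is finite.

\end{document}
```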
