Question

4.7. Consider a two-layer feedforward ANN with two inputs a and b, one hidden unit c, and one output unit d. This network has five weights (w_ca, w_cb, w_c0, w_dc, w_d0), where w_x0 represents the threshold weight for unit x. Initialize these weights to the values (.1, .1, .1, .1, .1), then give their values after each of the first two training iterations of the BACKPROPAGATION algorithm. Assume learning rate η = 0.3, momentum α = 0.9, incremental weight updates, and the following training examples:

a  b  d
1  0  1
0  1  0

Use the BACKPROPAGATION algorithm from Table 4.2 (p. 98). This entails that you should assume that the hidden unit c and the output unit d are sigmoid units. Use stochastic gradient descent. This means that in iteration 1, you should present the first training example and update the weights. In iteration 2, you should present the second training example and update the weights again. It is in iteration 2 that momentum starts playing a role.

BACKPROPAGATION(training_examples, η, n_in, n_out, n_hidden)
Each training example is a pair of the form (x, t), where x is the vector of network input values and t is the vector of target network output values. η is the learning rate (e.g., .05), n_in is the number of network inputs, n_hidden the number of units in the hidden layer, and n_out the number of output units. The input from unit i into unit j is denoted x_ji, and the weight from unit i to unit j is denoted w_ji.
Create a feed-forward network with n_in inputs, n_hidden hidden units, and n_out output units. Initialize all network weights to small random numbers (e.g., between −.05 and .05). Until the termination condition is met, do: For each (x, t) in training_examples, do:
Propagate the input forward through the network:
1. Input the instance x to the network and compute the output o_u of every unit u in the network.
Propagate the errors backward through the network:
2. For each network output unit k, calculate its error term δ_k = o_k(1 − o_k)(t_k − o_k).   (T4.3)
3. For each hidden unit h, calculate its error term δ_h = o_h(1 − o_h) Σ_{k ∈ outputs} w_kh δ_k.   (T4.4)
4. Update each network weight w_ji ← w_ji + Δw_ji, where Δw_ji = η δ_j x_ji.   (T4.5)
Answer #1

Given: learning rate η = 0.3, momentum α = 0.9, and initial weights w_ca = w_cb = w_c0 = w_dc = w_d0 = 0.1. Both the hidden unit c and the output unit d are sigmoid units, o = σ(net) = 1/(1 + e^(−net)).

Iteration 1 (first example: a = 1, b = 0, target t = 1):

Forward pass:
net_c = w_c0 + w_ca·a + w_cb·b = 0.1 + 0.1·1 + 0.1·0 = 0.2, so o_c = σ(0.2) ≈ 0.5498
net_d = w_d0 + w_dc·o_c = 0.1 + 0.1·0.5498 ≈ 0.1550, so o_d = σ(0.1550) ≈ 0.5387

Backward pass:
δ_d = o_d(1 − o_d)(t − o_d) = 0.5387 · 0.4613 · 0.4613 ≈ 0.1146
δ_c = o_c(1 − o_c) · w_dc · δ_d = 0.5498 · 0.4502 · 0.1 · 0.1146 ≈ 0.00284

Weight updates (no momentum contribution yet, since all previous Δw = 0):
Δw_dc = η·δ_d·o_c ≈ 0.3 · 0.1146 · 0.5498 ≈ 0.0189  →  w_dc ≈ 0.1189
Δw_d0 = η·δ_d·1 ≈ 0.0344                              →  w_d0 ≈ 0.1344
Δw_ca = η·δ_c·a ≈ 0.00085                             →  w_ca ≈ 0.1009
Δw_cb = η·δ_c·b = 0                                   →  w_cb = 0.1
Δw_c0 = η·δ_c·1 ≈ 0.00085                             →  w_c0 ≈ 0.1009

Iteration 2 (second example: a = 0, b = 1, target t = 0); momentum now applies, Δw(n) = η·δ·x + α·Δw(n−1):

Forward pass:
net_c = 0.1009 + 0.1·1 ≈ 0.2009, so o_c ≈ 0.5500
net_d = 0.1344 + 0.1189 · 0.5500 ≈ 0.1998, so o_d ≈ 0.5498

Backward pass:
δ_d = 0.5498 · 0.4502 · (0 − 0.5498) ≈ −0.1361
δ_c = 0.5500 · 0.4500 · 0.1189 · (−0.1361) ≈ −0.0040

Weight updates (with momentum):
Δw_dc = 0.3·(−0.1361)·0.5500 + 0.9·0.0189 ≈ −0.0054   →  w_dc ≈ 0.1135
Δw_d0 = 0.3·(−0.1361)·1 + 0.9·0.0344 ≈ −0.0099        →  w_d0 ≈ 0.1245
Δw_ca = 0.3·(−0.0040)·0 + 0.9·0.00085 ≈ 0.00077       →  w_ca ≈ 0.1016
Δw_cb = 0.3·(−0.0040)·1 + 0.9·0 ≈ −0.0012             →  w_cb ≈ 0.0988
Δw_c0 = 0.3·(−0.0040)·1 + 0.9·0.00085 ≈ −0.00044      →  w_c0 ≈ 0.1004

So after two iterations the weights are approximately (w_ca, w_cb, w_c0, w_dc, w_d0) ≈ (0.1016, 0.0988, 0.1004, 0.1135, 0.1245).
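The two iterations can be checked with a short script. This is a minimal sketch (not from the original answer) of the incremental BACKPROPAGATION update with momentum from Table 4.2, specialized to this five-weight network; the weight keys "ca", "cb", "c0", "dc", "d0" mirror the exercise's w_ca … w_d0, with "0" denoting the threshold input x0 = 1.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Five weights, all initialized to 0.1 as the exercise requires.
w = {"ca": 0.1, "cb": 0.1, "c0": 0.1, "dc": 0.1, "d0": 0.1}
prev = {k: 0.0 for k in w}        # previous weight deltas, for the momentum term
eta, alpha = 0.3, 0.9             # learning rate, momentum

# Training examples from the exercise: (a, b, target d).
examples = [(1, 0, 1), (0, 1, 0)]

for a, b, t in examples:
    # Forward pass: hidden unit c, then output unit d (both sigmoid).
    o_c = sigmoid(w["c0"] + w["ca"] * a + w["cb"] * b)
    o_d = sigmoid(w["d0"] + w["dc"] * o_c)
    # Error terms (T4.3, T4.4) -- delta_c uses the pre-update w_dc.
    delta_d = o_d * (1 - o_d) * (t - o_d)
    delta_c = o_c * (1 - o_c) * w["dc"] * delta_d
    # Gradient input x_ji for each weight (threshold weights see x = 1).
    grads = {"ca": delta_c * a, "cb": delta_c * b, "c0": delta_c,
             "dc": delta_d * o_c, "d0": delta_d}
    # Weight update with momentum (T4.5): dw(n) = eta*delta*x + alpha*dw(n-1).
    for k in w:
        prev[k] = eta * grads[k] + alpha * prev[k]
        w[k] += prev[k]
    print({k: round(v, 4) for k, v in w.items()})
```

Running it prints the weights after each iteration; the second line should come out near (0.1016, 0.0988, 0.1004, 0.1135, 0.1245), matching the hand calculation above.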
