Question

What is the vanishing gradient problem in neural networks? How can it be corrected?

Answer #1

The vanishing gradient problem is a problem with gradient-based learning methods (e.g., backpropagation). In particular, it makes it very hard to learn and tune the parameters of the earlier layers of the network.

As a result of vanishing gradients, a deep learning model takes longer to train, learns less from the data, and sometimes fails to train at all, so the network converges slowly or not at all.

The cause is that backpropagation computes the gradient for an early layer as a product of per-layer factors; with saturating activations such as sigmoid, whose derivative is at most 0.25, this product shrinks roughly exponentially with depth, so the gradient reaching the early layers becomes vanishingly small and their weights barely update.
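To see this numerically, here is a minimal sketch in plain NumPy (the depth, layer sizes, and weight scale are hypothetical, chosen only to make the decay visible) that backpropagates through a chain of sigmoid units and prints how quickly the gradient decays:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy 20-layer chain of single sigmoid units.
n_layers = 20
weights = rng.normal(scale=1.0, size=n_layers)

# Forward pass, keeping each activation for the backward pass.
a = 0.5
activations = [a]
for w in weights:
    a = sigmoid(w * a)
    activations.append(a)

# Backward pass: each layer multiplies the gradient by w * sigmoid'(z).
# Since sigmoid'(z) = a * (1 - a) <= 0.25, the product shrinks roughly
# exponentially as it travels back toward the input.
grad = 1.0
for layer in reversed(range(n_layers)):
    a_out = activations[layer + 1]
    grad *= weights[layer] * a_out * (1.0 - a_out)
    if layer % 5 == 0:
        print(f"gradient reaching layer {layer:2d}: {abs(grad):.3e}")
```

Running this shows the gradient magnitude dropping by several orders of magnitude every few layers, which is exactly why the earliest layers stop learning.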

Possible solutions are:

  • Long Short-Term Memory (LSTM) networks, whose gating lets gradients flow across many time steps in recurrent models
  • Faster hardware, which shortens the long training times the problem causes (though it does not fix the shrinking gradients themselves)
  • Other activation functions, such as ReLU, whose derivative is 1 for positive inputs and so does not shrink the backpropagated gradient
  • Residual networks, whose skip connections give the gradient an identity path around each block (ReLU and skip connections are both illustrated in the sketch after this list)
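As a rough illustration of why ReLU and skip connections help, here is a small sketch (again plain NumPy; the block structure, depth, and weight value are hypothetical) comparing the gradient that survives 20 plain sigmoid layers against 20 residual ReLU blocks, estimated with finite differences:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    return np.maximum(0.0, x)

# Plain layer: a -> sigmoid(w * a). The per-layer gradient factor is
# w * sigmoid'(z), well below 1 here, so the gradient vanishes with depth.
def plain_layer(x, w):
    return sigmoid(w * x)

# Residual block: a -> a + relu(w * a). The identity path contributes a
# derivative of exactly 1 per block, so the gradient cannot shrink to zero.
def residual_block(x, w):
    return x + relu(w * x)

def deep_net(x, block, n=20, w=0.1):
    for _ in range(n):
        x = block(x, w)
    return x

# Central finite difference to estimate d(output)/d(input).
eps = 1e-6
for name, block in [("plain sigmoid", plain_layer),
                    ("residual ReLU", residual_block)]:
    g = (deep_net(0.3 + eps, block) - deep_net(0.3 - eps, block)) / (2 * eps)
    print(f"{name:14s}: gradient after 20 layers = {g:.3e}")
```

The plain sigmoid stack reports a gradient that is numerically zero, while the residual ReLU stack keeps a usable gradient, which is the core intuition behind these fixes.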