Quickprop, an Alternative to Back-Propagation
Scott Fahlman’s idea to speed up gradient descent
9 min read · Aug 25, 2020
Because the vanilla back-propagation algorithms of the '80s and '90s converged slowly, Scott Fahlman invented a learning algorithm dubbed Quickprop [1], loosely based on Newton's method. His simple idea outperformed back-propagation (with various adjustments) on problem domains like the…
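To make the Newton's-method connection concrete, here is a minimal sketch of the Quickprop update: it treats the error curve along each weight as a parabola fitted through the current and previous gradient, and jumps toward that parabola's minimum. The function name, the epsilon guard, and the fallback learning rate `lr` are illustrative choices, not from Fahlman's paper; the growth limit corresponds to his "maximum growth factor".

```python
def quickprop_step(w, grad, prev_grad, prev_step, lr=1e-4, max_growth=1.75):
    """One Quickprop update for a single weight (illustrative sketch).

    Secant/parabola step: dw = grad / (prev_grad - grad) * prev_step.
    Falls back to an ordinary gradient step when the denominator is tiny.
    """
    denom = prev_grad - grad
    if abs(denom) > 1e-12:
        step = grad / denom * prev_step
    else:
        step = -lr * grad  # plain gradient-descent fallback
    # Limit growth relative to the previous step ("maximum growth factor").
    limit = max_growth * abs(prev_step)
    if limit > 0:
        step = max(-limit, min(step, limit))
    return w + step, step
```

For a truly quadratic error surface the parabola fit is exact, so one step lands on the minimum: minimizing f(w) = w² with gradient 2w, moving from w = 2 (gradient 4) to w = 1 (gradient 2) with previous step −1 yields the next step 2/(4−2)·(−1) = −1, i.e. w = 0.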