Quickprop, an Alternative to Back-Propagation

Scott Fahlman’s idea to speed up gradient descent

Johanna Appel
Towards Data Science
9 min read · Aug 25, 2020


[Figure: Animation of Quickprop]

Because the vanilla back-propagation algorithm of the ’80s/’90s converged slowly, Scott Fahlman invented a learning algorithm dubbed Quickprop [1], loosely based on Newton’s method. His simple idea outperformed back-propagation (with various adjustments) on problem domains like the…
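To make the idea concrete, here is a minimal sketch of the Quickprop update rule as described in Fahlman's paper (function and variable names are my own). For each weight, Quickprop uses the current and previous gradients as a secant approximation of the second derivative (a cheap stand-in for Newton's method) and jumps toward the minimum of the implied parabola:

```python
# Quickprop update for a single weight (sketch, not Fahlman's original code):
#
#     dw(t) = g(t) / (g(t-1) - g(t)) * dw(t-1)
#
# where g is the gradient of the loss w.r.t. that weight and dw the weight change.

def quickprop_step(g_now: float, g_prev: float, dw_prev: float,
                   eps: float = 1e-12) -> float:
    """One Quickprop update; eps guards against a zero denominator."""
    return g_now / (g_prev - g_now + eps) * dw_prev


# Toy example: minimize f(w) = (w - 3)^2, whose gradient is 2 * (w - 3).
grad = lambda w: 2.0 * (w - 3.0)

w = 0.0
g_prev = grad(w)
dw = -0.1 * g_prev          # bootstrap with one plain gradient-descent step
w += dw

g_now = grad(w)
dw = quickprop_step(g_now, g_prev, dw)  # the secant jump
w += dw
```

On a quadratic loss the secant estimate is exact, so the single Quickprop step above lands on the minimum (w = 3.0); on real, non-quadratic losses the jump is only approximate, which is why Fahlman adds safeguards such as a maximum growth factor.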


