Types of Optimization Algorithms used in Neural Networks and Ways to Optimize Gradient Descent
15 min read · Jun 10, 2017
Have you ever wondered which optimization algorithm to use for your neural network model to produce better and faster results by updating the model parameters, such as the weights and bias values? Should we use Gradient Descent, Stochastic Gradient Descent, or Adam?
Before writing this article, I too didn’t know the major differences between these optimization strategies, or which ones work better than others and why.