Deep Learning

Neural Network: The Dead Neuron

The biggest drawback of ReLU activation function

Luthfi Ramadhan
Towards Data Science
6 min read · Nov 16, 2021


Photo by Josh Riemer on Unsplash

Choosing an activation function for the hidden layer is not an easy task. Hidden-layer configuration is an extremely active area of research, and there is no established theory for how many neurons to use, how many layers to stack, or which activation function to pick given a…
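As context for the dead-neuron problem the title refers to, here is a minimal sketch (an illustration added here, not code from the article) of ReLU and its gradient. A neuron whose pre-activation stays negative gets a gradient of zero and can stop learning entirely:

```python
import numpy as np

def relu(x):
    # ReLU passes positive inputs through and clamps negatives to zero
    return np.maximum(0.0, x)

def relu_grad(x):
    # Gradient is 1 for positive inputs and 0 otherwise: a neuron stuck
    # in the negative region receives no gradient signal, i.e. it is "dead"
    return (x > 0).astype(float)

pre_activations = np.array([-2.0, -0.5, 0.0, 1.5])
print(relu(pre_activations))       # [0.  0.  0.  1.5]
print(relu_grad(pre_activations))  # [0. 0. 0. 1.]
```

Note that three of the four example neurons output zero and receive zero gradient, so weight updates cannot move them out of the negative region through this path.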
