Deep Learning
Neural Network: The Dead Neuron
The biggest drawback of ReLU activation function
6 min read · Nov 16, 2021
Choosing an activation function for the hidden layers is not an easy task. Hidden-layer configuration is an extremely active area of research, and there is no established theory for how many neurons, how many layers, or which activation function to use given a…
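As a quick illustration of the "dead neuron" problem named in the title, here is a minimal NumPy sketch (an assumption of mine, not code from the article): ReLU's gradient is exactly zero for any negative pre-activation, so a neuron whose inputs stay negative receives no weight updates.

```python
import numpy as np

def relu(z):
    # ReLU passes positive inputs through and clamps everything else to 0
    return np.maximum(0.0, z)

def relu_grad(z):
    # Gradient is 1 for positive inputs and exactly 0 otherwise,
    # so a neuron stuck with negative pre-activations stops learning
    return (z > 0).astype(float)

z = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(relu(z))       # [0.  0.  0.  0.5 2. ]
print(relu_grad(z))  # [0. 0. 0. 1. 1.]
```

For the three non-positive inputs both the output and the gradient are zero; if a neuron's pre-activation is always in that region, it is "dead".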