Netflix Data Science Interview Practice Problems
A walkthrough of some of Netflix’s interview questions!
Netflix is one of the most elite tech companies in the world, so it's no surprise that their data science interview questions are notably challenging. Below are several questions that have previously been asked in Netflix's data science interviews, along with my attempts at answering them.
Q: Why is Rectified Linear Unit a good activation function?
The Rectified Linear Unit, also known as the ReLU function, is generally a better activation function than the sigmoid and tanh functions because it allows gradient descent to converge faster. With sigmoid and tanh, when the input z is very large (positive or negative), the slope of the function becomes very small, so the gradients passed back during training shrink toward zero and learning slows significantly (the vanishing gradient problem). The ReLU function, defined as max(0, z), does not have this issue: its derivative is a constant 1 for all positive inputs, and it is also very cheap to compute.
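To make this concrete, here is a small sketch (the function names are my own) comparing the gradients of sigmoid and ReLU at increasingly large inputs. The sigmoid gradient collapses toward zero, while the ReLU gradient stays at 1 for any positive input:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_grad(z):
    # Derivative of sigmoid: s(z) * (1 - s(z)), which peaks at 0.25
    # when z = 0 and vanishes as |z| grows.
    s = sigmoid(z)
    return s * (1.0 - s)

def relu_grad(z):
    # Derivative of ReLU: 1 for positive inputs, 0 otherwise.
    return (z > 0).astype(float)

z = np.array([0.0, 2.0, 5.0, 10.0])
print("sigmoid grad:", sigmoid_grad(z))  # shrinks rapidly toward 0
print("relu grad:   ", relu_grad(z))     # constant 1 for z > 0
```

At z = 10 the sigmoid gradient is already on the order of 1e-5, which is why deep networks stacked with sigmoid activations train so slowly compared to ReLU networks.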