Netflix Data Science Interview Practice Problems

A walkthrough of some of Netflix’s interview questions!

Terence Shin, MSc, MBA
Towards Data Science


Netflix is one of the most elite tech companies in the world, so it's no surprise that their data science interview questions are more challenging than most. Below are several questions that have previously been asked in Netflix's data science interviews, along with my attempts at answering them.

Q: Why is Rectified Linear Unit a good activation function?


The Rectified Linear Unit, also known as the ReLU function, is known to be a better activation function than the sigmoid and tanh functions because it allows gradient descent to converge faster. With sigmoid and tanh, when the input x (or z) is very large in magnitude, the curve flattens and the gradient becomes very small, which slows gradient descent significantly (the vanishing gradient problem). The ReLU function does not have this issue: for any positive input its gradient is a constant 1, so the error signal passes through without shrinking.
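To make the saturation point concrete, here is a minimal sketch (function names are my own) comparing the gradient of the sigmoid with the gradient of ReLU at increasingly large inputs. The sigmoid's gradient collapses toward zero, while ReLU's stays at 1 for any positive input:

```python
import numpy as np

def sigmoid_grad(z):
    # Derivative of sigmoid: s(z) * (1 - s(z)); peaks at 0.25 when z = 0
    s = 1.0 / (1.0 + np.exp(-z))
    return s * (1.0 - s)

def relu_grad(z):
    # Derivative of ReLU: 1 for z > 0, 0 otherwise
    return 1.0 if z > 0 else 0.0

for z in [0.0, 2.0, 5.0, 10.0]:
    print(f"z = {z:5.1f}   sigmoid' = {sigmoid_grad(z):.6f}   relu' = {relu_grad(z):.0f}")
```

At z = 10 the sigmoid's gradient is already on the order of 10⁻⁵, so weight updates through that unit nearly vanish, whereas the ReLU gradient is unchanged.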

Q: What is the use of regularization? What are the differences between L1 and L2 regularization?
