Making Sense of Shapley Values
Oct 27, 2019 · 9 min read
The first time I heard about Shapley values was while reading up on model interpretability. I came across SHAP, a framework for better understanding why your machine learning model behaves the way it does. It turns out that Shapley values have been around for a while: they originated in the field of game theory back in 1953, with the purpose of solving the following scenario: