Making Sense of Shapley Values

Marko Cotra
Towards Data Science
9 min read · Oct 27, 2019


Image credit: Iker Urteaga at https://unsplash.com/photos/TL5Vy1IM-uA

The first time I heard about Shapley values was when I was reading up on model interpretability. I came across SHAP, a framework for better understanding why your machine learning model behaves the way it does. It turns out that Shapley values have been around for a while: they originated in the field of game theory back in 1953, with the purpose of solving the following scenario:
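To make the game-theory origin concrete, here is a minimal sketch of the classical Shapley value computed exactly by enumerating all coalitions. The function names and the toy two-player game are my own illustration, not part of the original article; the formula itself is the standard 1953 definition, where each player's value is the weighted average of their marginal contribution across all coalitions.

```python
from itertools import combinations
from math import factorial

def shapley_values(players, v):
    """Exact Shapley values for a cooperative game.

    players: list of player identifiers
    v: characteristic function mapping a frozenset of players to a payoff
    """
    n = len(players)
    values = {}
    for i in players:
        others = [p for p in players if p != i]
        total = 0.0
        # Sum the weighted marginal contribution of player i over
        # every coalition S that does not contain i.
        for r in range(n):
            for coalition in combinations(others, r):
                S = frozenset(coalition)
                weight = factorial(len(S)) * factorial(n - len(S) - 1) / factorial(n)
                total += weight * (v(S | {i}) - v(S))
        values[i] = total
    return values

# Hypothetical toy game: payoff 1 only if both players cooperate.
v = lambda S: 1.0 if S == frozenset({"a", "b"}) else 0.0
print(shapley_values(["a", "b"], v))  # each player is credited 0.5
```

This brute-force enumeration is only feasible for small games (the number of coalitions grows as 2^n), which is exactly why frameworks like SHAP rely on approximations for machine learning models.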

