Monte Carlo Dropout

Improve your neural network for free with one small trick, and get a model uncertainty estimate as a bonus.

Michał Oleszak
Towards Data Science
8 min read · Sep 20, 2020


There ain’t no such thing as a free lunch, at least according to the popular adage. Well, not anymore! At least not when it comes to neural networks. Read on to see how to improve your network’s performance with an incredibly simple yet clever trick called Monte Carlo Dropout.

Dropout

The magic trick we are about to introduce only works if your neural network has dropout layers, so let’s kick off by briefly introducing them. Dropout boils down to simply switching off some neurons at each training step. At each step, a different set of neurons is switched off. Mathematically speaking, each neuron has some probability p of being ignored, called the dropout rate. The dropout rate is typically set between 0 (no dropout) and 0.5 (approximately 50% of all neurons will be switched off). The exact value depends on the network type, layer size, and the degree to which the network overfits the training data.

A full network (left) and the same network with two neurons dropped out in a particular training step (right).
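
To make this concrete, here is a minimal sketch of a network with dropout layers, assuming TensorFlow/Keras. The layer sizes, the 20-feature input, and the dropout rate of 0.25 are illustrative choices, not taken from the original:

```python
import tensorflow as tf

# A small fully connected classifier with two dropout layers.
# During training, each Dropout layer independently zeroes out
# each of its inputs with probability p = 0.25 at every step;
# at inference time, Keras disables dropout by default.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(20,)),              # 20 input features (illustrative)
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dropout(0.25),            # dropout rate p = 0.25
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dropout(0.25),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy"])
```

Because a different random subset of neurons is dropped at each step, the network cannot rely on any single neuron and is pushed toward more robust, redundant representations, which is why dropout acts as a regularizer.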
