
MAML (Model-Agnostic Meta-Learning) is a class of meta-learning algorithms created by Dr. Chelsea Finn, a UC Berkeley alum now on the Stanford faculty, and her collaborators. MAML was inspired by the question of how much data is really needed to learn something new. Can we teach algorithms to learn how to learn?
In this context, conventional machine learning algorithms face a few challenges:
- Intensive training is required
- Labeled data for some problems may be limited
- The performance of the network may be sensitive to the choice of hyperparameters
Accordingly, meta-learning algorithms can be designed to address tasks such as:
- Dynamic selection of inductive bias
- Building meta-rules for multi-task learning
- Learning to learn with hyperparameter optimization
Taken from Chelsea Finn’s original research:
MAML is a meta-learning algorithm that is compatible with any model trained with gradient descent and covers problems ranging from classification and regression to reinforcement learning (RL).
What kind of problems does MAML solve?
MAML trains a model on a variety of tasks so that it can learn a new task from only a small number of training samples.
A few important points of MAML are:
- MAML doesn’t expand the number of learned parameters.
- It places no constraints on the model’s architecture.
- It can be combined with common deep-learning architectures such as Recurrent Neural Networks (RNNs), Convolutional Neural Networks (CNNs), and Multi-Layer Perceptrons (MLPs).
Problem Setup
The problem setup for MAML is reproduced from the original paper.

MAML introduces an outer loop, called meta-training, that optimizes the model’s initial parameters so that a small number of inner-loop gradient steps on a new task’s training samples yields good performance on that task.
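The two-loop structure can be sketched in a few lines of NumPy. This is a simplified, first-order variant of MAML on the paper’s sinusoid-regression toy problem; the real algorithm differentiates through the inner update, and the feature map and learning rates here are illustrative assumptions, not the paper’s setup:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_task():
    # A task is a sinusoid with random amplitude and phase, as in the paper.
    amp, phase = rng.uniform(0.1, 5.0), rng.uniform(0.0, np.pi)
    return lambda x: amp * np.sin(x + phase)

def phi(x):
    # Illustrative fixed feature map (inputs scaled to [-1, 1] for stability);
    # the model is linear in w, which keeps the sketch short.
    xn = x / 5.0
    return np.stack([np.ones_like(xn), xn, xn**2, xn**3], axis=1)

def loss_and_grad(w, x, y):
    # Mean-squared error of the linear model phi(x) @ w, and its gradient.
    err = phi(x) @ w - y
    return np.mean(err ** 2), 2.0 * phi(x).T @ err / len(x)

alpha, beta = 0.05, 0.005     # inner- and outer-loop learning rates (assumed)
w = np.zeros(4)               # meta-learned initialization

for step in range(2000):      # outer loop: meta-training across tasks
    f = sample_task()
    x_support = rng.uniform(-5.0, 5.0, 10)
    x_query = rng.uniform(-5.0, 5.0, 10)
    # Inner loop: one gradient step on the task's support set from the shared init.
    _, g_support = loss_and_grad(w, x_support, f(x_support))
    w_task = w - alpha * g_support
    # Outer update: move the init using the adapted weights' query-set gradient
    # (first-order approximation: second derivatives are dropped).
    _, g_query = loss_and_grad(w_task, x_query, f(x_query))
    w -= beta * g_query

# At meta-test time, adapting to a brand-new task takes one small gradient step.
f_new = sample_task()
x_new = rng.uniform(-5.0, 5.0, 10)
loss_before, g = loss_and_grad(w, x_new, f_new(x_new))
loss_after, _ = loss_and_grad(w - alpha * g, x_new, f_new(x_new))
```

The point of meta-training is exactly this last step: a single inner-loop update from the learned initialization should already reduce the loss on an unseen task.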
How to run the MAML code?
Chelsea Finn’s GitHub repo provides the code to reproduce the MAML results. You can use the following steps to reproduce them:
- First, create a Python virtual environment and activate it:
sudo apt install virtualenv
virtualenv --python=python3.6 maml
source maml/bin/activate
- Next, install the dependencies:
pip install tensorflow==1.11.0
pip install image
I didn’t use the latest version of TensorFlow, as the MAML code was written a couple of years ago, before TF2 was released for public use.
- Clone the MAML repo:
git clone https://github.com/cbfinn/maml
- Download the Omniglot data. For this article, I am only going to run the Omniglot example in addition to the sinusoid example:
wget https://github.com/brendenlake/omniglot/raw/master/python/images_background.zip
wget https://github.com/brendenlake/omniglot/raw/master/python/images_evaluation.zip
- Unzip the images_background and images_evaluation zip files into the maml/data/omniglot folder, where maml is the GitHub repo folder. The directory structure should then look as follows (each unzipped folder contains the Omniglot alphabet subfolders):

maml/data/omniglot/
    images_background/
    images_evaluation/
- Navigate to the data subfolder of the maml folder and copy the contents of omniglot to omniglot_resized. Then run the resize-images script:
cd maml/data
cp -r omniglot/* omniglot_resized/
cd omniglot_resized
python resize_images.py
- Now, go back to the root directory of the maml repo and run the two examples:
a. Sinusoid example:
python main.py --datasource=sinusoid --logdir=logs/sine/ --metatrain_iterations=70000 --norm=None --update_batch_size=10
b. Omniglot example:
python main.py --datasource=omniglot --metatrain_iterations=60000 --meta_batch_size=32 --update_batch_size=1 --update_lr=0.4 --num_updates=1 --logdir=logs/omniglot5way/
The checkpoint of each example will be saved in the logs/ directory.
Update: I discovered what I believe is a bug in the codebase, probably introduced inadvertently. In main.py, line 160, saver.save(sess, FLAGS.logdir + '/' + exp_string + '/model' + str(itr)), the itr variable is the for-loop iterator, and I believe the saver.save statement should be executed inside the for loop, at the end of each iteration, rather than after the loop.
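To illustrate why the placement matters, here is a toy stand-in for the training loop (the save function and loop body are hypothetical simplifications, not the repo’s actual code):

```python
# Toy illustration of the checkpoint-saving issue: `save` stands in for
# saver.save, and the loop body for one meta-training iteration.
saved = []

def save(path):
    saved.append(path)

SAVE_INTERVAL = 100

# Buggy placement: save runs once, after the loop, with the final itr value,
# so only a single checkpoint is ever written.
for itr in range(300):
    pass  # one meta-training step
save('logs/model' + str(itr))
assert saved == ['logs/model299']

# Suggested fix: save inside the loop so intermediate checkpoints are written.
saved.clear()
for itr in range(300):
    # ... one meta-training step ...
    if itr % SAVE_INTERVAL == 0:
        save('logs/model' + str(itr))
print(saved)  # checkpoints at iterations 0, 100, and 200
```

With the save outside the loop you lose all intermediate checkpoints, which matters for long meta-training runs that may be interrupted.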
From here onwards, an intermediate-level machine learning practitioner should be able to modify the pipeline to meet their requirements.