
Free GPUs for Training Your Deep Learning Models

Yes, you heard it right. It's free.

Image by Abigail Diseno from Pixabay

GPUs are an essential part of training deep learning models, and they don’t come cheap. In this article, we examine some platforms that provide free GPUs without the restrictions of free trial periods or limited free credits, and without requiring a credit card at sign-up.

Quick Comparison

The 3 platforms we are examining are Gradient, Google Colab and Kaggle. Here’s a quick summary of some of the hardware specs and features that each of these platforms provides.

Comparison of features. (Image by author)

Maximum Execution Time Per Session: the maximum time your code can run before the session times out

Idle Time: the maximum time you can leave your notebook idle before it shuts down

*: 15GB free storage from Google Drive

Gradient

Gradient by Paperspace offers an end-to-end, cloud-based MLOps solution. As part of its product offering, Gradient provides community notebooks, which are public and shareable Jupyter notebooks that run on free cloud GPUs and CPUs.

(Screenshot from Paperspace)

Pros

  1. 8 CPUs with 30GB RAM, the highest of all 3 providers
  2. Long idle time of 1–6 hours

Cons

  1. No private notebooks when running on free GPUs
  2. Free GPUs are subject to availability
  3. Low GPU memory of 8GB
  4. Low free storage space of 5GB

Gradient has different pricing tiers that allow for different levels of CPU / GPU instance types. Let’s take a look at the free tier.

1. Sign Up For An Account

Signing up for Gradient is hassle-free with one-click sign-up.

(Screenshot from Paperspace)

2. Create a notebook

  1. Name your project
  2. Select a runtime
(Screenshot from Paperspace)
  3. Select a GPU machine. Be sure to select those that are tagged "Free". Notebooks that run on free GPUs are public and will auto-shutdown after a maximum of 6 hours.
(Screenshot from Paperspace)

The free-tier plan for hobbyists and students only allows the Quadro M4000 GPU to be selected. Free GPUs are also subject to availability and may not be available during peak periods.

3. Launch Jupyter Lab

Starting the notebook brings us to a workspace from which we can launch Jupyter Lab. Let’s check out the GPU and CPU specs using Jupyter Lab’s terminal.

nvidia-smi

The free version offers a Quadro M4000 GPU with 8GB of memory.

(Screenshot from Paperspace)

lscpu

Gradient’s free GPU machine also comes with 8 Intel Xeon E5-2623 CPUs and 30GB of RAM.

(Screenshot from Paperspace)

cat /proc/meminfo

(Screenshot from Paperspace)
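
Beyond the terminal commands, you can also confirm from a notebook cell that your deep learning framework actually sees the GPU. Below is a minimal sketch assuming PyTorch is installed in the runtime (TensorFlow’s tf.config.list_physical_devices('GPU') offers an equivalent check); the same cell works on Colab and Kaggle as well.

# Check whether the deep learning framework can see the GPU
# (assumes PyTorch is available in the runtime)
import torch

if torch.cuda.is_available():
    print("GPU detected:", torch.cuda.get_device_name(0))
    device = torch.device("cuda")
else:
    print("No GPU detected, falling back to CPU")
    device = torch.device("cpu")

# Move models and tensors to the selected device as usual, e.g. model.to(device)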

Google Colab

Google Colaboratory (Colab), launched by Google, is a Jupyter Notebook IDE with access to free GPUs and TPUs. All you need is a Google account to get started. Colab allows you to mount your Google Drive as a storage folder for your Colab projects, which makes reading and saving data a breeze. And yes, Google Drive is also free and comes with 15GB of storage.
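
As a quick sketch, mounting Drive takes only a couple of lines inside a Colab cell. The project folder below is a hypothetical example, and the Drive root may appear as "MyDrive" or "My Drive" depending on the Colab version.

# Mount Google Drive inside a Colab notebook (google.colab is available only on Colab)
from google.colab import drive

drive.mount('/content/drive')  # prompts for authorization on every new session

# Files in your Drive then appear under the mount point,
# e.g. a hypothetical project folder:
data_dir = '/content/drive/MyDrive/my_project/data'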

Pros

  1. Integration with Google Drive for data storage
  2. No usage limit
  3. Allows both private and public notebooks
  4. Long execution time of 12 hours

Cons

  1. Short idle time of ~30 minutes
  2. Need to remount and authenticate Google Drive before every session

To use a GPU in Colab, select GPU as the hardware accelerator under Runtime → Change runtime type in the top ribbon.

(Screenshot from Google Colab)

According to Google Colab’s FAQ, Colab offers a variety of GPUs such as Nvidia K80s, T4s, P4s and P100s; however, you will not be able to select a specific GPU type. Let’s take a look at some of the hardware specs that Colab has to offer.

nvidia-smi

(Screenshot from Google Colab)

For this particular session, I was provisioned a Tesla K80 with ~12GB of memory.

lscpu

(Screenshot from Google Colab)

Colab also offers 2 Intel Xeon CPUs @ 2.3GHz with 13GB of RAM.

cat /proc/meminfo

(Screenshot from Google Colab)

Kaggle

Kaggle is a data science community known for hosting top-tier machine learning and deep learning competitions. It offers Kaggle Notebooks, which are shareable Jupyter Notebooks backed by free GPUs and CPUs.

Kaggle offers a dynamic GPU quota of at least 30 hours, which resets every Saturday at midnight UTC. I usually get between 30 and 41 hours of GPU quota per week. One of the things I like about Kaggle Notebooks is its commit mode, which allows users to run the entire notebook in the background. This means we do not have to keep the browser open while the code is running. Kaggle also supports notebook versioning: simply do a quick save or run the notebook in commit mode. This allows us to view previous notebook versions and track our progress.

Pros

  1. Commit mode enables code to run in the background
  2. Allows other users to comment on notebooks
  3. 16GB of GPU memory, the highest of all 3 providers
  4. Allows both public and private notebooks
  5. Access to Kaggle datasets (see the snippet after the cons below)
  6. Notebook versioning

Cons

  1. Weekly dynamic GPU quota of 30–40 hours
  2. Short idle time of 20 minutes in interactive GPU mode
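
As a rough sketch, datasets attached to a Kaggle Notebook are mounted read-only under /kaggle/input; the dataset name and CSV file below are hypothetical placeholders.

# Datasets attached to a Kaggle Notebook appear read-only under /kaggle/input
import os
import pandas as pd

# List every file from the datasets attached to this notebook
for dirname, _, filenames in os.walk('/kaggle/input'):
    for filename in filenames:
        print(os.path.join(dirname, filename))

# Load one of the files (hypothetical dataset name and file)
df = pd.read_csv('/kaggle/input/some-dataset/train.csv')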

Let’s take a look at some hardware specs that Kaggle has to offer.

nvidia-smi

(Screenshot from Kaggle)

Kaggle offers a P100 GPU with 16GB memory.

lscpu

(Screenshot from Kaggle)

It also offers 2 Intel Xeon CPUs @ 2.00GHz with a maximum of 13GB of RAM.

Conclusion

We reviewed 3 different platforms that provide GPU-enabled Jupyter notebooks for free. These platforms were selected because they do not require a credit card to sign up, impose free-credit limits, or restrict usage to a trial period.

Kaggle is my go-to platform when I intend to create shareable notebooks. In my previous article on training a deep learning sentiment analysis model, I used Kaggle Notebooks for both training and inference; feel free to check it out. The weekly GPU quota might be a downer for heavy GPU users, but Kaggle has provided some tips on how to use GPUs more efficiently. Feel free to share your experiences and alternative platforms.
