
GPUs are an essential part of training deep learning models, and they don’t come cheap. In this article, we examine platforms that provide free GPUs without the restrictions of free trial periods or limited free credits, and without requiring a credit card during sign up.
Quick Comparison
The 3 platforms we are examining are Gradient, Google Colab and Kaggle. Here’s a quick summary of some of the hardware specs and features that each of these platforms provides.

Maximum Execution Time Per Session
: Maximum time your code can run before it times out
Idle Time
: Maximum time you can leave your notebook idle before it shuts down
* 15GB free storage from Google Drive
Gradient
Gradient by Paperspace offers an end-to-end, cloud-based MLOps solution. As part of its product offering, Gradient provides community notebooks, which are public and shareable Jupyter notebooks that run on free cloud GPUs and CPUs.

Pros
- 8 CPUs with 30GB RAM, the highest of all 3 providers
- Long idle time of 1–6 hours
Cons
- No private notebook if running on free GPU
- GPU is subject to availability
- Low GPU Memory of 8GB
- Low free storage space of 5GB
Gradient has different pricing tiers that allow for different levels of CPU / GPU instance types. Let’s take a look at the free tier.
1. Sign Up For An Account
Signing up for Gradient is hassle-free with one-click sign up.

2. Create a notebook
- Name your project
- Select a runtime

- Select a GPU machine. Be sure to select those that are tagged "Free". Notebooks that run on free GPUs are public and will auto-shutdown after a maximum of 6 hours.

The free tier plan for hobbyists and students only allows the Quadro M4000 GPU to be selected. Free GPUs are also subject to availability and may not be available during peak periods.
3. Launch Jupyter Lab
Starting the notebook brings us to a workspace from which we can launch Jupyter Lab. Let’s check out the GPU and CPU specs using Jupyter Lab’s terminal.
nvidia-smi
The free version offers a Quadro M4000 GPU with 8GB of memory.

lscpu
The Gradient free GPU machine also comes with 8 Intel Xeon E5-2623 CPUs and 30GB RAM.

cat /proc/meminfo
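If you prefer not to leave the notebook, a minimal sketch along the lines below reports the same GPU, CPU and RAM figures from a code cell. It assumes PyTorch is pre-installed in the Gradient runtime (most of its deep learning containers ship with it).

import os
import torch  # assumption: PyTorch is available in the notebook image

# GPU: name and total memory of the first CUDA device, if one is visible
if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    print(f"GPU: {props.name}, {props.total_memory / 1024**3:.1f} GB memory")
else:
    print("No CUDA GPU visible")

# CPU: logical core count reported by the OS
print(f"CPU cores: {os.cpu_count()}")

# RAM: total system memory from /proc/meminfo (first line is MemTotal, in kB)
with open("/proc/meminfo") as f:
    mem_kb = int(f.readline().split()[1])
print(f"RAM: {mem_kb / 1024**2:.1f} GB")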

Google Colab
Google Colaboratory (Colab) is a Jupyter Notebook IDE by Google with access to free GPUs and TPUs. All you need is a Google account to get started. Google Colab allows you to mount your Google Drive as a storage folder for your Colab projects, which makes reading and saving data a breeze. And yes, you can also get Google Drive for free, which comes with 15GB of storage.
Pros
- Integration with Google Drive for data storage
- No usage limit
- Allows both private and public notebooks
- Long execution time of 12 hours
Cons
- Short idle time of ~30 minutes
- Need to remount and authenticate Google Drive before every session (see the snippet below)
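Remounting and authenticating Drive is only a couple of lines. The sketch below uses Colab’s built-in google.colab.drive helper; the my_project folder is just a hypothetical example path.

import os
from google.colab import drive

# Prompts for Google authentication, then mounts Drive under /content/drive
drive.mount('/content/drive')

# Hypothetical example: write a file into a project folder on Drive
project_dir = '/content/drive/MyDrive/my_project'  # replace with your own folder
os.makedirs(project_dir, exist_ok=True)
with open(f'{project_dir}/notes.txt', 'w') as f:
    f.write('saved from Colab')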
To use a GPU in Colab, select GPU as the hardware accelerator under Runtime → Change Runtime Type in the top ribbon.

According to Google Colab’s FAQ, Colab offers a variety of GPUs such as Nvidia K80s, T4s, P4s and P100s; however, you will not be able to select a specific GPU type. Let’s take a look at some of the hardware specs that Colab has to offer.
nvidia-smi

I was provisioned with a Tesla K80 with ~12GB of memory for this particular session.
lscpu

Colab also offers 2 Intel Xeon 2.3GHz CPUs with 13GB RAM.
cat /proc/meminfo

Kaggle
Kaggle is a Data Science community space known for hosting top-tier Machine Learning and Deep Learning competitions. It offers Kaggle Notebooks, which are shareable Jupyter Notebooks backed by free GPUs and CPUs.
Kaggle offers a dynamic GPU quota of at least 30 hours per week, which resets every Saturday at midnight UTC. I usually get between 30 and 41 hours of GPU quota per week. One of the things I like about Kaggle Notebooks is commit mode, which runs the entire notebook in the background, so we do not have to keep the browser open while the code is running. Kaggle also versions notebooks whenever you do a quick save or run the notebook in commit mode, which lets us view previous notebook versions and track our progress.
Pros
- Commit mode enables code to run in the background
- Allows other users to comment on notebooks
- 16GB GPU memory, the highest of all 3 providers
- Allows both public and private notebooks
- Access to Kaggle datasets (example below)
- Notebook versioning
Cons
- Weekly dynamic GPU quota of 30–40 hours
- Short idle time of 20 minutes in interactive GPU mode
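On the datasets point above: any dataset attached to a Kaggle Notebook is mounted read-only under /kaggle/input. The sketch below lists the attached files and loads a hypothetical CSV; the some-dataset/train.csv path is an illustrative placeholder, not a real dataset.

import os
import pandas as pd

# Datasets attached to the notebook are mounted read-only under /kaggle/input
for dirname, _, filenames in os.walk('/kaggle/input'):
    for filename in filenames:
        print(os.path.join(dirname, filename))

# Hypothetical example: load a CSV from an attached dataset
# (swap in a path printed by the loop above)
df = pd.read_csv('/kaggle/input/some-dataset/train.csv')
print(df.shape)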
Let’s take a look at some hardware specs that Kaggle has to offer.
nvidia-smi

Kaggle offers a P100 GPU with 16GB memory.
lscpu

It also offers 2 Intel Xeon CPUs @ 2.00GHz with a maximum of 13GB RAM.
Conclusion
We reviewed 3 different platforms that provide GPU-enabled Jupyter notebooks for free. These platforms were selected because they do not require a credit card at sign up and have no free credit limits or trial periods.
Kaggle is my go-to platform when I intend to create shareable notebooks. In my previous article on training a deep learning sentiment analysis model, I used Kaggle Notebooks for both training and inference, so feel free to check it out. The weekly GPU quota might be a downer for heavy GPU users; Kaggle has provided some tips on how to use GPUs more efficiently. Feel free to share your experiences and alternative platforms.
References:
- https://gradient.paperspace.com/pricing
- https://research.google.com/colaboratory/faq.html
- https://www.kaggle.com/docs/notebooks