Assemble an amazing deep learning machine at home for less than $1,500

Courtney Perigo
Towards Data Science
10 min read · Aug 26, 2017


Over the past year and a half, I’ve been exploring applications of deep learning and machine learning for a variety of clients. As we stood up these solutions using cloud computing resources, I felt the need to practice my skills outside of the corporate environment. Sure, I could use Amazon to spin up GPU-powered EC2 instances whenever I wanted to learn. But to really dive into this world of deep learning, I felt I had to do two things: 1) give myself a way to explore on my own, at my own pace and with my own equipment, and 2) put some skin in the game by investing in equipment that forced me to make it work or lose my big investment in myself.

Simply put, I had an itch to build my own high-powered machine that I could use any time I needed, and it doubles as a pretty nice home computer for the occasional home computing task. (At least, that’s my justification.)

This blog post represents proof that you too can build a deep learning machine very capable of handling a variety of deep learning tasks. I’ll cover everything from components to assembly and setup.

Let’s get started.

Picking the components

I have to admit, I was super intimidated by all the choices. Thankfully, there are some pretty helpful resources for choosing the right components for your deep learning computer. PC Part Picker kept me straight on what was needed and whether there were any compatibility issues.

NOTE: If you’re uncomfortable with modifications, please don’t choose my build. I had to make a special modification to the heat sink to fit it in its spot. This included cutting the heat sink with a metal-cutting blade on my circular saw. Thankfully, I had the tools lying around to make the adjustment.

Here’s my complete parts list: https://pcpartpicker.com/list/ZKPxqk

Motherboard — $125

For the motherboard, all I needed was support for one GPU and for the latest chip from Intel (Kaby Lake, i5-7600K). The Gigabyte — GA-B250N-Phoenix WIFI Mini ITX LGA1151 Motherboard fit the bill, and it has built-in WiFi and a pretty cool LED feature, so you can still impress your gamer friends.

Case — $90

Cases are inexpensive, but the size is important because we have some pretty large components to add. Make sure the case you choose can fit all of your components and can accommodate a Mini-ITX motherboard. The case I went with, the Corsair — Air 240 MicroATX Mid Tower Case, has a ton of fans, and the layout was pretty nice for my build.

CPU — $230

Our neural network will be using the GPU to do most of the heavy computations. In reality, the CPU could be an afterthought. Unfortunately, I’m pretty “boujee” and wanted a pretty significant processor. The Intel — Core i5-7600K 3.8GHz Quad-Core Processor fit the bill for me. First, it’s overclockable, so I feel like a badass. Feel free to downgrade this, but if you do… make sure your motherboard supports whatever processor you choose. So far, this processor has performed as expected: amazingly.

RAM — $155

RAM is cheap. Get a lot of it. Get the most you can. I haven’t been disappointed with any RAM upgrade I’ve ever purchased for my PCs. In my case, I purchased a huge 16GB of RAM. My choice, the Corsair — Vengeance LED 16GB (2 x 8GB) DDR4-3200 Memory, also came with the added feature of additional LEDs in your case… because you can never get enough light show action!

Heat Sink — $45

This was my BIGGEST regret for this setup. The heat sink I purchased, the Cooler Master — GeminII M4 58.4 CFM Sleeve Bearing CPU Cooler, is awesome. It just doesn’t fit. I had to CUT the fins at the edge to make room for my massive RAM. If I had to do it over again, I’d probably find a better heat sink that fits my build perfectly. The mods I made have held up so far, and the fan is ultra quiet, so I’m not gonna complain too much.

Storage — $50

Okay, let’s be honest: 1TB is huge. But ImageNet is also huge. If you can afford more storage, go for it, because when you start downloading datasets you’ll start to run out of space. I chose the Western Digital — Blue 1TB 3.5" 5400RPM Internal Hard Drive. It offers 1TB of storage over a SATA interface, which is fine for deep learning. If you want to upgrade this, consider bigger SATA hard drives or get fancy with faster solid-state drives. In hindsight, I will likely add a second SATA hard drive. I recommend you upgrade the storage space from the get-go; I’m thinking about getting 2-3 TB in the end.

Power Supply — $85

I went way overboard with my power supply, but it’s relatively cheap and I really didn’t want it to be an issue. My Cooler Master — 650W 80+ Bronze Certified Semi-Modular ATX Power Supply could power a small submarine and I won’t have any issues with my PC.

GPU — $700

This is the biggest investment in your system, so why skimp? The GPU will power all of your deep learning processing. There are many things to consider when picking out your GPU, and we’ll go over those in a second, but the one thing you need to do is buy an Nvidia card. Nvidia has put a lot of time and investment into supporting deep learning accelerated computing. Its parallel computing platform, CUDA, is supported by almost all of the deep learning frameworks, including TensorFlow, PyTorch, Caffe, and others.

Now that we know we’re selecting an Nvidia card, what else should we consider?

Budget: My budget was less than $1,000 for the card itself. There were lots of cards that fit the bill, including the GTX 1080 Ti, GTX 1080, GTX 1070 and others, so I had a lot of potential cards to choose from. The only cards off the table were the Titan X and the yet-to-be-released Volta.

One Vs. Multiple GPUs: I went with only one GPU due to the form factor of my chosen PC. I also plan to use this machine to explore deep learning at a surface level. I won’t really need more than one GPU for most of my use cases. Plus it seems like a pain to design a model around two cards. I’ll just keep it simple for now.

Memory: Tim Dettmers recommends 8GB of GPU memory as sufficient for most home deep learning projects. That sounds reasonable to me, too. Given that, most of the 10-series cards will work. In my case, the GTX 1080 Ti at 11GB was more than sufficient.

All in all, I went with the Nvidia GTX 1080 Ti at $699 on Amazon. A bit expensive, but worth it to get the power I was looking for in a single-GPU setup.

PC TOTAL COST — $1,475.

Of course, you’ll need a monitor, keyboard and mouse to use the machine.

PC Build

If you are really uncomfortable with a computer build, outsource this part to a pro! I was feeling very industrious, so I went for it. Besides one modification (see the heat sink section of the parts list above), the build was pretty straightforward. It took me a few hours, following the manuals that came with the motherboard and other components. I also consulted a variety of message boards to solve some human errors I made in the process.

Based on my experience, the best bet is to put the machine together outside of the case and test the parts. Once it’s tested and everything works, go ahead and install it all into the case.

Software Setup

Now that our computer is ready to go, it’s time to set up our software so we can dive into deep learning. For the operating system, I went with Ubuntu 16.04: it’s pretty well documented, and a lot of the libraries we’re using work in a Linux environment.

1. Install Ubuntu 16.04 — Desktop

The flavor of Linux that I went with was Ubuntu, and version 16.04 is the latest stable LTS release to date. You can download Ubuntu 16.04 Desktop here: https://www.ubuntu.com/download/desktop
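If you plan to install from a USB stick, one way to create it from another Linux machine is dd. This is just a sketch: /dev/sdX is a placeholder for your USB device (which will be overwritten, so double-check it), and the ISO filename should match whatever you downloaded.

sudo dd if=ubuntu-16.04-desktop-amd64.iso of=/dev/sdX bs=4M status=progress # write the installer image to the USB device
sync # flush writes before removing the stick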

I installed the operating system from a USB drive. I set it up to boot directly into the GUI, but it might be better to boot into a terminal for the CUDA and GPU driver installation. After installation, go ahead and update your operating system using apt-get.

sudo add-apt-repository ppa:graphics-drivers/ppa # add the graphics-drivers PPA so newer Nvidia drivers (like nvidia-381) are available
sudo apt-get update
sudo apt-get --assume-yes upgrade
sudo apt-get --assume-yes install tmux build-essential gcc g++ make binutils
sudo apt-get --assume-yes install software-properties-common
sudo apt-get --assume-yes install git

2. Install CUDA (without driver)

Now it’s time to install some technologies to help leverage our GPU for deep learning. Our first step is to install CUDA. You can install it with the driver, but I ran into compatibility problems with the 1080 Ti because the CUDA package didn’t come with the latest driver. In my case, I installed from the *.run file by downloading it directly from Nvidia.

https://developer.nvidia.com/cuda-downloads

Shut down lightdm and run your CUDA *.run file in the terminal. Remember, do not install the Nvidia driver that comes with CUDA.

sudo service lightdm stop
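For reference, running the installer looks roughly like this (a sketch; the exact filename depends on the CUDA 8.0 package you downloaded, and the installer will ask whether to install the bundled driver):

sudo sh cuda_8.0.61_375.26_linux.run # example filename; answer "no" when asked to install the bundled Nvidia driver
sudo service lightdm start # restart the display manager once the installer finishes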

After installation, you need to add CUDA to your $PATH and library path. You can do that by appending the following exports to your ~/.bashrc:

cat >> ~/.bashrc << 'EOF'
export PATH=/usr/local/cuda-8.0/bin${PATH:+:${PATH}}
export LD_LIBRARY_PATH=/usr/local/cuda-8.0/lib64${LD_LIBRARY_PATH:+:${LD_LIBRARY_PATH}}
EOF
source ~/.bashrc

3. Install the Nvidia driver that supports your GPU.

In my case, it was the Nvidia 381.22 driver — the latest driver for the 1080Ti at the time of publishing.

wget http://us.download.nvidia.com/XFree86/Linux-x86_64/381.22/NVIDIA-Linux-x86_64-381.22.run
sudo sh NVIDIA-Linux-x86_64-381.22.run
sudo reboot

Alternatively, you can download and install nvidia-381 via apt-get (see below.)

sudo apt-get install nvidia-381

4. Optional: The Login Loop

If you run into the dreaded login loop, you can purge your Nvidia drivers with the following code and then reinstall them (see step 3). I had to do this a couple of times for the installation to finally stick.

sudo apt-get remove --purge nvidia*
sudo apt-get autoremove
sudo reboot

5. Check CUDA and Nvidia Driver Installation

Run the following lines of code to check on your current installation.

nvcc --version # CUDA check
nvidia-smi # Driver check

6. Everything okay? Good, time to install CuDNN.

To install CuDNN, you need to be a card-carrying Nvidia Developer. Don’t worry, it’s easy to sign up and it’s free. The turnaround for me was instantaneous, but it may take up to 48 hours for your account to be confirmed. YMMV. Once you have an account, download CuDNN and copy it into your CUDA directories. My example is below:

tar -xzf cudnn-8.0-linux-x64-v5.1.tgz
cd cuda
sudo cp lib64/* /usr/local/cuda/lib64/
sudo cp include/* /usr/local/cuda/include/
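Optionally, you can make the copied files readable and confirm which version landed in place (a quick sanity check; the grep just prints the version macros from the cuDNN header):

sudo chmod a+r /usr/local/cuda/include/cudnn.h /usr/local/cuda/lib64/libcudnn* # ensure all users can read the libraries
grep -A 2 CUDNN_MAJOR /usr/local/cuda/include/cudnn.h # should show major version 5 for cuDNN 5.1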

7. Your deep learning environment and tools.

The remaining parts of the install involve TensorFlow, PyTorch, Keras, Python, Conda, and any other tools you want to use for your deep learning experiments. I’ll leave that setup to you, but remember that if you use TensorFlow, install the GPU-enabled build. My machine runs the following (a quick GPU sanity check follows the list):

Python 2.7 (comes with Ubuntu)

Tensorflow for Python 2.7 and GPU — https://www.tensorflow.org/install/install_linux

Pytorch — http://pytorch.org/

OpenCV — http://opencv.org/
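Once your frameworks are installed, it’s worth a quick check that they can actually see the GPU. A minimal sketch, assuming PyTorch and the GPU build of TensorFlow from the links above:

python -c "import torch; print(torch.cuda.is_available())" # should print True
python -c "from tensorflow.python.client import device_lib; print(device_lib.list_local_devices())" # should list a GPU device alongside the CPU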

Let’s See it in Action!

Okay, I’ve spent $1,500 on a PC to do experiments with deep learning problems. What can this machine do? Here are some fun examples I was able to demo on my new PC.

MNIST Solution in Seconds.

I wanted to get my feet wet by training a known MNIST solution from PyTorch: https://github.com/pytorch/examples/tree/master/mnist. Here’s a video of that solution training on my PC in about 45 seconds at 98% accuracy.
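If you want to reproduce this, running the example looks roughly like this (a sketch; it assumes the repository layout at the time, with the training script at examples/mnist/main.py):

git clone https://github.com/pytorch/examples.git
cd examples/mnist
python main.py # trains a small CNN on MNIST; the dataset is downloaded automatically on the first run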

IMAGENET Training.

I wanted to try out another CNN designed to solve the ImageNet problem in TensorFlow. I trained the ImageNet classifier from https://github.com/tensorflow/models/tree/master/tutorials/image/imagenet. After training, I classified an image of my buddy, Jonathan, scuba diving at a wreck site in the Florida Keys.
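The classification step itself looks roughly like this (a sketch; classify_image.py is the script in that tutorial directory, and the image path is a placeholder):

python classify_image.py --image_file /path/to/jonathan_scuba.jpg # prints the top ImageNet classes and their scores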

Last Example: YOLO real-time Object Recognition

Finally, I wanted to really test the GPU’s capability by asking it to recognize objects in real time from my webcam. Here is a video of me getting really excited about my machine’s capabilities.
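For reference, the Darknet implementation of YOLO runs its webcam demo roughly like this (a sketch; the config and weight filenames depend on the YOLO release you download):

./darknet detector demo cfg/coco.data cfg/yolo.cfg yolo.weights # opens the default webcam and draws detections in real time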

What’s Next?

Now that I have a pretty amazing deep learning capability in my house, I have a lot of experiments planned. My next projects involve more computer vision applications. I may even make a classifier for my Mom’s birthday. She was looking for an application to help her classify plants in her yard. No problem, Mom. I got you covered.


