Fast.ai: The TOC of Practical Deep Learning — Part 1

If the USF Data Institute certificate and MOOC from Fast.ai were a book, this would be its table of contents

Jason I. Carter
Towards Data Science

--

I’m currently on a journey to turn my interest in and fascination with deep learning into a career. Coming from a software engineering background, I’m a little ahead of the curve compared to other non-PhD/MS people, but not by much.

You can read my job-quitting and career-pivoting posts here on Medium, and an updated commentary version on LinkedIn.

Blogging is a big part of my journey, particularly for keeping track of what I’m actually learning. Not just for myself in the moment, but for anyone following in similar footsteps. So this post is for anyone interested in taking the Fast.ai and USF Data Institute Deep Learning program, or simply wondering what it teaches.

This post was also partly inspired by a tweet dismissing the excellent work Rachel Thomas and Jeremy Howard are doing as dumbed down.

Part 10 of Rachel’s response is worth reading (in fact, read all 11 parts; it’s great!).

Part 1

Chapter 1: Image Recognition

  • Set Up Your Own AWS Instance
  • Introduction to Deep Learning and Why It’s Useful
  • 7 Lines of Code to Deep Learning (sketched after this list)
  • Breaking Down VGG16
  • Enter a Kaggle Competition: Place in the top 50% of Dogs vs. Cats Redux
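
That “7 lines of code” bullet is the hook: a pre-trained network classifies images almost out of the box. Below is a minimal sketch of the idea in plain Keras; the course actually wraps this in its own Vgg16 helper class, and cat.jpg is just a placeholder file name.

    # Load an ImageNet-pretrained VGG16 and classify a single image.
    # (My own illustration, not the course notebook; 'cat.jpg' is hypothetical.)
    import numpy as np
    from keras.applications.vgg16 import VGG16, preprocess_input, decode_predictions
    from keras.preprocessing import image

    model = VGG16(weights='imagenet')            # downloads pre-trained weights
    img = image.load_img('cat.jpg', target_size=(224, 224))
    x = preprocess_input(np.expand_dims(image.img_to_array(img), axis=0))
    print(decode_predictions(model.predict(x), top=3))  # top-3 ImageNet guesses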

Chapter 2: CNNs

  • Dogs vs. Cats Redux: State-of-the-art model explained
  • Pre-trained Networks vs. Random Weights
  • The How and Why of Fine-Tuning (see the sketch after this list)
  • Dense Layers, Stochastic Gradient Descent and Activation Functions
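
To give a flavor of the fine-tuning bullet: the standard recipe is to freeze the pre-trained convolutional base, bolt a fresh dense head on top, and train only that head on the new classes. A rough Keras sketch (mine, not the course’s):

    # Fine-tuning sketch: reuse VGG16's convolutional features, retrain the head.
    from keras.applications.vgg16 import VGG16
    from keras.layers import Dense, Flatten
    from keras.models import Model

    base = VGG16(weights='imagenet', include_top=False, input_shape=(224, 224, 3))
    for layer in base.layers:
        layer.trainable = False                  # freeze the pre-trained base

    x = Flatten()(base.output)
    out = Dense(2, activation='softmax')(x)      # new head: dogs vs. cats
    model = Model(inputs=base.input, outputs=out)
    model.compile(optimizer='adam', loss='categorical_crossentropy',
                  metrics=['accuracy'])
    # then: model.fit(...) on your labeled batches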

Chapter 3: Understanding Over- and Underfitting

  • CNNs at Work: From dot product to convs and back(propagation) again
  • Dealing with Over- and Underfitting
  • CNN Regularization Techniques
  • Semi-supervised Learning, Pseudo Labeling and Knowledge Distillation
  • Convolutions and SGD in Excel (the convolution arithmetic is sketched in NumPy below)
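
The Excel exercises exist to make the arithmetic concrete. Here’s the same convolution arithmetic in NumPy, with toy numbers I picked: a 2x2 kernel slid across a 3x3 image.

    # What a convolution computes: slide the kernel, multiply, sum.
    import numpy as np

    img = np.array([[1., 2., 3.],
                    [4., 5., 6.],
                    [7., 8., 9.]])
    kernel = np.array([[0., 1.],
                       [1., 0.]])

    out = np.zeros((2, 2))                       # a 2x2 kernel over 3x3 -> 2x2
    for i in range(2):
        for j in range(2):
            out[i, j] = np.sum(img[i:i+2, j:j+2] * kernel)
    print(out)  # [[ 6.  8.]
                #  [12. 14.]]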

Chapter 4: Collaborative Filtering, Embeddings, and More

  • Embeddings in Excel
  • Recommendation Systems and Collaborative Filtering (sketched after this list)
  • Latent Factors
  • Build More Models
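
The collaborative filtering and latent factors bullets boil down to one idea: learn an embedding per user and per movie, then predict a rating as their dot product. A minimal Keras sketch, with sizes I made up for illustration:

    # Collaborative filtering as two embeddings and a dot product.
    from keras.layers import Input, Embedding, Flatten, Dot
    from keras.models import Model

    n_users, n_movies, n_factors = 1000, 2000, 50    # assumed dataset sizes

    user_in = Input(shape=(1,))
    movie_in = Input(shape=(1,))
    u = Flatten()(Embedding(n_users, n_factors)(user_in))    # user latent factors
    m = Flatten()(Embedding(n_movies, n_factors)(movie_in))  # movie latent factors
    rating = Dot(axes=1)([u, m])                             # predicted rating
    model = Model(inputs=[user_in, movie_in], outputs=rating)
    model.compile(optimizer='adam', loss='mse')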

Chapter 5: An Introduction to NLP and RNNs

  • Keras Functional API
  • Multi-sized CNNs (sketched after this list)
  • Sentiment Analysis
  • NLP Transfer Learning
  • Visualizing Neural Networks as Computational Graphs
  • Introduction to RNNs
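
The functional API and multi-sized CNNs bullets fit together: the API lets you run several convolution branches, each with a different filter width, in parallel over word embeddings and merge them, a common sentiment-analysis pattern. All sizes below are my assumptions, not the course’s exact values:

    # Parallel Conv1D branches (filter widths 3, 4, 5) merged for sentiment.
    from keras.layers import (Input, Embedding, Conv1D, GlobalMaxPooling1D,
                              Dense, concatenate)
    from keras.models import Model

    vocab_size, seq_len = 5000, 500          # assumed vocabulary / review length

    inp = Input(shape=(seq_len,))
    emb = Embedding(vocab_size, 32)(inp)
    branches = [GlobalMaxPooling1D()(Conv1D(64, k, activation='relu')(emb))
                for k in (3, 4, 5)]          # one branch per filter width
    out = Dense(1, activation='sigmoid')(concatenate(branches))
    model = Model(inputs=inp, outputs=out)
    model.compile(optimizer='adam', loss='binary_crossentropy',
                  metrics=['accuracy'])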

Chapter 6: RNNs from Scratch, Embeddings and Theano APIs

  • Embeddings for RNNs in Excel
  • Building RNN models from Scratch with Keras (the core recurrence is sketched after this list)
  • Building RNNs with Theano APIs
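
“From scratch” means seeing the recurrence itself: every step mixes the new input with the previous hidden state through the same weights. The bare arithmetic in NumPy (shapes are illustrative):

    # One RNN step, no framework: h_new = tanh(Wxh @ x + Whh @ h + b).
    import numpy as np

    n_in, n_hidden = 10, 16                          # assumed sizes
    Wxh = np.random.randn(n_hidden, n_in) * 0.01     # input -> hidden weights
    Whh = np.random.randn(n_hidden, n_hidden) * 0.01 # hidden -> hidden weights
    bh = np.zeros(n_hidden)

    def rnn_step(x, h):
        return np.tanh(Wxh @ x + Whh @ h + bh)

    h = np.zeros(n_hidden)
    for x in np.random.randn(5, n_in):               # a toy 5-step sequence
        h = rnn_step(x, h)                           # state carried forward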

Chapter 7: CNN Architectures, Bounding Boxes and Multiple Inputs

  • An Overview of ResNet and Inception CNNs
  • Data Leakage
  • CNNs: How to deal with multiple inputs, outputs and bounding boxes (sketched after this list)
  • Fully Convolutional Networks
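
For the multiple outputs and bounding boxes bullet, the pattern is one shared backbone with two heads: a classifier and a four-number box regressor, each with its own loss. An illustrative Keras sketch; the class count and losses are my assumptions:

    # One backbone, two outputs: class probabilities and a bounding box.
    from keras.applications.resnet50 import ResNet50
    from keras.layers import Dense, GlobalAveragePooling2D
    from keras.models import Model

    base = ResNet50(weights='imagenet', include_top=False,
                    input_shape=(224, 224, 3))
    x = GlobalAveragePooling2D()(base.output)
    class_out = Dense(10, activation='softmax', name='class')(x)  # assumed 10 classes
    bbox_out = Dense(4, activation='linear', name='bbox')(x)      # x, y, w, h
    model = Model(inputs=base.input, outputs=[class_out, bbox_out])
    model.compile(optimizer='adam',
                  loss={'class': 'categorical_crossentropy', 'bbox': 'mse'})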

Summary

Quite a lot is covered in Part 1 of this program. For each “chapter”, there’s a two-hour video, accompanying Jupyter notebook(s), notes/wiki (written and updated mostly by students), suggested readings (blogs, papers, etc.) from Jeremy, assignments (not marked, but recommended practice work) and an amazingly active forum of in-person and online students.

The teaching style is a “top-down” approach, which you can read more about here. Personally, I think it’s great! Just as I don’t need to know assembly or machine code to write Python, I shouldn’t need to know the intricacies of every math formula to use deep learning techniques.

But don’t be mistaken: there IS math, there ARE formulas, complicated concepts to understand and dirt-dry academic papers to read, if that’s your thing. It’s the approach that’s different (and arguably better), and that’s what intrigues so many people. And like any course or program, online or in college, the effort you put in matters.

As for me, I’ve successfully completed Part 1 and feel much smarter than I did a month ago. I’m not an expert in deep learning (yet!), but thanks to Fast.ai I certainly have more knowledge and practical experience than a number of my colleagues.

Now, I’m working my way through Part 2, and hopefully in about four weeks you’ll be ready for my follow-up post.

Thoughts? Questions about the program? Feel free to ask in the comments below.
