Weekly Selection — May 11, 2018

TDS Editors
Towards Data Science
2 min read · May 11, 2018

Linear Algebra for Deep Learning

by Vihar Kurama — 5 min read

Linear algebra, probability and calculus are the ‘languages’ in which machine learning is formulated. Learning these topics will contribute to a deeper understanding of the underlying algorithmic mechanics and allow the development of new algorithms.
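
As a small illustration of that point, a neural-network layer is just linear algebra: an affine map followed by a nonlinearity. A minimal NumPy sketch (the shapes here are illustrative, not from the article):

```python
import numpy as np

batch, n_in, n_out = 4, 3, 2
x = np.random.randn(batch, n_in)   # a batch of inputs, one row per example
W = np.random.randn(n_in, n_out)   # weight matrix
b = np.zeros(n_out)                # bias vector

h = np.maximum(0, x @ W + b)       # matrix multiply plus bias, then ReLU
print(h.shape)                     # (4, 2)
```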

Deep Learning Book Notes: Introduction to Probability

by Adrien Lucas Ecoffet — 18 min read

This is the first part of my notes for Chapter 3 of the Deep Learning book. It can also serve as a quick intro to probability. These notes cover about half of the chapter (the part on introductory probability); a follow-up post will cover the rest (some more advanced probability and information theory).

The Logistic Regression Algorithm

by Niklas Donges — 7 min read

Logistic Regression is one of the most widely used Machine Learning algorithms for binary classification. It is a simple algorithm that you can use as a performance baseline; it is easy to implement, and it will do well enough in many tasks.
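
As a quick sketch of the baseline idea, here is logistic regression on a built-in dataset (assuming scikit-learn; the dataset and settings are illustrative choices, not from the article):

```python
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# A standard binary classification dataset, split into train and test.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Fit the baseline; a higher max_iter ensures the solver converges.
baseline = LogisticRegression(max_iter=5000)
baseline.fit(X_train, y_train)
print(f"Baseline accuracy: {baseline.score(X_test, y_test):.3f}")
```

Any more complex model you try later can then be judged against this number.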

Building a Custom Mask RCNN model with Tensorflow Object Detection

by Priya Dwivedi — 5 min read

You can now build a custom Mask RCNN model using the TensorFlow Object Detection Library! Mask RCNN is an instance segmentation model that can identify the pixel-by-pixel location of any object.

Demystifying Generative Adversarial Networks

by Stefan Hosein — 5 min read

In this tutorial (derived from my original post here), you will learn what Generative Adversarial Networks (GANs) are without going into the details of the math. Afterwards, you will learn how to code a simple GAN that can create digits!
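
To give a flavour of what such a tutorial builds, here is a minimal GAN training loop in TensorFlow/Keras (the architecture and hyperparameters below are illustrative assumptions, not the article's code):

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

latent_dim = 100

# Generator: maps random noise to a 28x28 image.
generator = models.Sequential([
    layers.Dense(256, activation="relu", input_shape=(latent_dim,)),
    layers.Dense(784, activation="tanh"),
    layers.Reshape((28, 28)),
])

# Discriminator: classifies images as real (1) or fake (0).
discriminator = models.Sequential([
    layers.Flatten(input_shape=(28, 28)),
    layers.Dense(256, activation="relu"),
    layers.Dense(1, activation="sigmoid"),
])
discriminator.compile(optimizer="adam", loss="binary_crossentropy")

# Stacked model: trains the generator to fool the (frozen) discriminator.
discriminator.trainable = False
gan = models.Sequential([generator, discriminator])
gan.compile(optimizer="adam", loss="binary_crossentropy")

(x_train, _), _ = tf.keras.datasets.mnist.load_data()
x_train = x_train / 127.5 - 1.0  # scale to [-1, 1] to match the tanh output

batch = 64
for step in range(1000):
    # Train the discriminator on a real batch and a fake batch.
    real = x_train[np.random.randint(0, len(x_train), batch)]
    noise = np.random.normal(size=(batch, latent_dim))
    fake = generator.predict(noise, verbose=0)
    discriminator.train_on_batch(real, np.ones((batch, 1)))
    discriminator.train_on_batch(fake, np.zeros((batch, 1)))
    # Train the generator: make the discriminator call its fakes "real".
    gan.train_on_batch(np.random.normal(size=(batch, latent_dim)),
                       np.ones((batch, 1)))
```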

Deep Learning for Machine Empathy: Robots and Humans Interaction — Part I

by Nelson Fernandez — 4 min read

With the next digital revolution imminent, humanity will face an unprecedented wave of automation. More and more smart, connected devices will coexist with us.

Fast Near-Duplicate Image Search using Locality Sensitive Hashing

by Gal Yona — 9 min read

If you have some education in Machine Learning, the name Nearest Neighbor probably reminds you of the k-nearest neighbors algorithm. It is a very simple algorithm with seemingly no “learning” actually involved: the kNN rule simply classifies each unlabeled example by the majority label among its k nearest neighbors in the training set.
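
For reference, here is the brute-force kNN rule in plain NumPy, exactly the exhaustive distance scan that locality sensitive hashing is designed to speed up (the toy data is illustrative):

```python
import numpy as np

def knn_predict(X_train, y_train, X_query, k=3):
    preds = []
    for x in X_query:
        dists = np.linalg.norm(X_train - x, axis=1)  # distance to every training point
        nearest = np.argsort(dists)[:k]              # indices of the k closest
        labels, counts = np.unique(y_train[nearest], return_counts=True)
        preds.append(labels[np.argmax(counts)])      # majority vote
    return np.array(preds)

# Tiny example: two clusters around (0, 0) and (5, 5).
X = np.array([[0, 0], [0, 1], [1, 0], [5, 5], [5, 6], [6, 5]], dtype=float)
y = np.array([0, 0, 0, 1, 1, 1])
print(knn_predict(X, y, np.array([[0.5, 0.5], [5.5, 5.5]])))  # -> [0 1]
```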

One problem to explain why AI works

by Peter Sweeney — 20 min read

Ask your resident experts, Why does AI work? Readily, they’ll explain How it works, methods emptying in a mesmerizing jargonfall of gradient descent. But why? Why will an expensive and inscrutable machine create the knowledge I need to solve my problem?

Building a vibrant data science and machine learning community. Share your insights and projects with our global audience: bit.ly/write-for-tds