Weekly Selection — Apr 6, 2018

TDS Editors
Towards Data Science
3 min read · Apr 6, 2018


Hyper-parameters in Action! Part I — Activation Functions

by Daniel Godoy — 11 min read

Deep Learning is all about hyper-parameters! Maybe this is an exaggeration, but having a sound understanding of the effects of different hyper-parameters on training a deep neural network is definitely going to make your life easier.
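The activation functions the article surveys are simple to write down; a minimal NumPy sketch (not taken from the article) of three common choices:

```python
import numpy as np

def sigmoid(z):
    # Squashes inputs into (0, 1); saturates for large |z|
    return 1.0 / (1.0 + np.exp(-z))

def tanh(z):
    # Zero-centered alternative, output in (-1, 1)
    return np.tanh(z)

def relu(z):
    # Rectified linear unit: cheap, and does not saturate for z > 0
    return np.maximum(0.0, z)

z = np.array([-2.0, 0.0, 2.0])
print(sigmoid(z))  # roughly [0.119, 0.5, 0.881]
print(relu(z))     # [0. 0. 2.]
```

The saturation behavior visible here (sigmoid flattening at the extremes, ReLU staying linear for positive inputs) is exactly the kind of hyper-parameter effect the article explores.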

Semi-Supervised Learning and GANs

by Raghav Mehta — 5 min read

Most deep learning classifiers require a large number of labeled samples to generalize well, but getting such data is an expensive and difficult process. To deal with this limitation, semi-supervised learning was introduced: a class of techniques that makes use of a small amount of labeled data along with a large amount of unlabeled data.

Save Lives With 10 Lines of Code: Detecting Parkinson’s with XGBoost

by Priansh Shah — 3 min read

So you’ve dabbled in data science and have heard the term “XGBoost” thrown around a bit, but don’t know what it is. I’m a big fan of learning by doing, so let’s try using XGBoost on a real-life problem: diagnosing Parkinson’s.
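To give a flavor of the gradient-boosted-trees approach the article uses, here is a short sketch with scikit-learn's GradientBoostingClassifier as a stand-in for XGBoost (same family of models, different library), on synthetic data rather than the article's Parkinson's dataset:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the article's Parkinson's voice features
X, y = make_classification(n_samples=500, n_features=22, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Gradient-boosted decision trees; XGBoost implements the same idea
clf = GradientBoostingClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print(f"test accuracy: {clf.score(X_test, y_test):.2f}")
```

Swapping in `xgboost.XGBClassifier` gives essentially the same fit/score interface, which is why the article can do real diagnosis work in so few lines.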

Visualizing Beethoven’s Oeuvre, Part I: Scraping and cleaning data from IMSLP

by Michael Zhang — 10 min read

This article is the first part in a short tutorial series I’m writing to document my progress in a personal side project where I’m hoping to analyze and visualize the complete works of Beethoven. The goal of this project is to explore the connection between music and emotion, while also experimenting with different ways of visualizing music data, especially with regard to color.

Data Visualization with Bokeh in Python, Part III: Making a Complete Dashboard

by William Koehrsen — 10 min read

Sometimes I learn a data science technique to solve a specific problem. Other times, as with Bokeh, I try out a new tool because I see some cool projects on Twitter and think: “That looks pretty neat.”

How to build a Neural Network with Keras

by Niklas Donges — 9 min read

Keras is one of the most popular Deep Learning libraries out there at the moment and has made a big contribution to the commoditization of artificial intelligence. It is simple to use and enables you to build powerful Neural Networks in just a few lines of code.
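A minimal sketch of that "few lines of code" claim, assuming TensorFlow's bundled Keras API; the layer sizes and random data here are illustrative, not from the article:

```python
import numpy as np
from tensorflow import keras

# Tiny binary classifier: two dense layers, sizes chosen only for illustration
model = keras.Sequential([
    keras.layers.Dense(32, activation="relu", input_shape=(8,)),
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# One epoch on random data, just to show the training call
X = np.random.rand(16, 8).astype("float32")
y = np.random.randint(0, 2, size=(16, 1))
model.fit(X, y, epochs=1, verbose=0)
```

The same `Sequential`/`compile`/`fit` pattern scales up to the much deeper networks the article builds.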

The Variational Autoencoder as a Two-Player Game — Part I

by Max Frenzel — 17 min read

The field of AI, and particularly the sub-field of Deep Learning, has been exploding with progress in the past few years. One particular approach, generative models, has been responsible for much of this progress.

Paper repro: Deep Metalearning using “MAML” and “Reptile”

by Adrien Lucas Ecoffet — 8 min read

In this post I reproduce two recent papers in the field of metalearning: MAML and the similar Reptile. The full notebook for this reproduction can be found here.
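The Reptile meta-update the post reproduces is simple enough to sketch in NumPy; a toy version (not the post's notebook code) on a family of 1-D linear-regression tasks:

```python
import numpy as np

rng = np.random.default_rng(0)

def inner_sgd(theta, X, y, lr=0.02, steps=10):
    """A few gradient steps on one task's loss, starting from the meta-params."""
    w = theta.copy()
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

# Meta-parameters for a one-weight linear model; tasks differ in true slope
theta = np.zeros(1)
meta_lr = 0.1
for _ in range(200):
    slope = rng.uniform(1.0, 3.0)          # sample a task
    X = rng.uniform(-1, 1, size=(20, 1))
    y = slope * X[:, 0]
    phi = inner_sgd(theta, X, y)           # adapt to the sampled task
    theta += meta_lr * (phi - theta)       # Reptile meta-update

print(theta)  # drifts toward the middle of the task distribution
```

The whole trick is the last line: instead of backpropagating through the inner loop as MAML does, Reptile just nudges the meta-parameters toward the adapted weights.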


Building a vibrant data science and machine learning community. Share your insights and projects with our global audience: bit.ly/write-for-tds