Dear readers and contributors,
thank you for continuing to support Towards Data Science. Extending the theme of our monthly Data Science primer, we present some of this week’s best AI posts.
We are delighted to welcome Inês Teixeira, Arun Nambiar, Cherie Chung, Sami Amegavi, and Dassi Orleando to our team as Editorial Associates. We are excited to offer this support to our contributors, and we hope that writers will take full advantage of their talents.
A candid conversation about machine learning with Madison May
By Ryan Louie – Reading: 8 min.
Welcome to Candid Machine Learning Conversations. I’m Ryan Louie, and I interview machine learning practitioners about how they think about their profession.
Computational Thinking with Reddit at the Wolfram Summer School
By swede white – Reading: 8 min.
Around this time last year, I was trying to figure out what to do with my summer. It had been a rather hellacious semester in graduate school, but I was finally done with core coursework in my doctoral program in sociology at Louisiana State University.
Backward Propagation for Feed Forward Networks
By SauceCat – Reading: 7 min.
To train a network, an error function is used to measure the model’s performance. The goal is to minimize this error (cost) by updating the corresponding model parameters. To know in which direction and by how much to update each parameter, the derivative of the error function with respect to that parameter must be calculated, and this is what backward propagation is for.
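To give a flavour of the mechanics before you dive in, here is a minimal NumPy sketch of one training loop for a one-hidden-layer network with a squared-error cost. The layer sizes, random data, and learning rate are illustrative assumptions of ours, not code from the article:

```python
import numpy as np

# Illustrative sketch only: sizes, data, and learning rate are
# arbitrary choices, not taken from the article.
rng = np.random.default_rng(0)
X = rng.normal(size=(32, 4))          # 32 samples, 4 features
y = rng.normal(size=(32, 1))          # regression targets

W1 = rng.normal(scale=0.1, size=(4, 8))
b1 = np.zeros(8)
W2 = rng.normal(scale=0.1, size=(8, 1))
b2 = np.zeros(1)
lr = 0.1

for step in range(100):
    # Forward pass.
    h = np.tanh(X @ W1 + b1)          # hidden activations
    y_hat = h @ W2 + b2               # predictions
    err = y_hat - y
    cost = np.mean(err ** 2)          # the error (cost) function

    # Backward pass: derivatives of the cost w.r.t. each parameter,
    # obtained by applying the chain rule layer by layer.
    d_yhat = 2 * err / len(X)         # dCost/dy_hat
    dW2 = h.T @ d_yhat
    db2 = d_yhat.sum(axis=0)
    d_h = d_yhat @ W2.T
    d_pre = d_h * (1 - h ** 2)        # tanh'(z) = 1 - tanh(z)^2
    dW1 = X.T @ d_pre
    db1 = d_pre.sum(axis=0)

    # Gradient step: move each parameter against its derivative.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2
```

The derivatives tell both the direction and the magnitude of each update, which is exactly the point the article develops in more depth.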
Tracking pedestrians for self-driving cars
By Priya Dwivedi – Reading: 5 min.
A self-driving car needs a map of the world around it as it drives. It must continuously track pedestrians, cars, bikes, and other moving objects on the road. In this article I walk through a technique called the Extended Kalman Filter, which is used by Google’s self-driving car to track moving objects on the road.
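If you’d like a concrete picture of what one filter cycle looks like, here is a rough sketch of the predict and update steps of an Extended Kalman Filter for a constant-velocity pedestrian state, with a nonlinear range/bearing measurement. All matrices, noise values, and numbers below are illustrative assumptions of ours, not the article’s parameters:

```python
import numpy as np

# State x = [px, py, vx, vy]: position and velocity of the pedestrian.

def predict(x, P, dt, q=0.5):
    """Propagate the state and covariance with a constant-velocity model."""
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]], dtype=float)
    Q = q * np.eye(4)                            # simplified process noise
    return F @ x, F @ P @ F.T + Q

def update(x, P, z, R):
    """Correct the state with a range/bearing measurement z = [rho, phi]."""
    px, py = x[0], x[1]
    rho = np.hypot(px, py)
    h = np.array([rho, np.arctan2(py, px)])      # nonlinear h(x)
    # Jacobian of h at the current estimate: the "extended" part,
    # which linearizes the nonlinear sensor model.
    H = np.array([[px / rho, py / rho, 0, 0],
                  [-py / rho**2, px / rho**2, 0, 0]])
    y = z - h                                    # innovation
    y[1] = (y[1] + np.pi) % (2 * np.pi) - np.pi  # wrap angle to [-pi, pi]
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)               # Kalman gain
    return x + K @ y, (np.eye(4) - K @ H) @ P

x = np.array([5.0, 2.0, 1.0, 0.0])   # initial state guess
P = np.eye(4)                        # initial uncertainty
R = np.diag([0.09, 0.0009])          # assumed measurement noise
x, P = predict(x, P, dt=0.1)
x, P = update(x, P, z=np.array([5.6, 0.38]), R=R)
```

Each cycle first predicts where the object should be, then corrects that prediction with the latest sensor reading; the full article explains how this plays out with real sensor data.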