[ Paper Summary ] Google DeepMind — Toward an integration of deep learning and neuroscience — Part 1

Jae Duk Seo
Towards Data Science
4 min read · Apr 3, 2018



Google DeepMind: the best of the best, and one of the most innovative AI research labs of our generation, with an amazing record of publications. Today I'll try to give a one-to-two-sentence summary of each part of their paper "Toward an integration of deep learning and neuroscience". Please note that the paper is 60 pages long, so I will divide this summary into several posts.

Please note this post is for my future self to look back.

Abstract

Thanks to the development of more complex machine learning models as well as more sophisticated cost functions, it may now be possible to bridge the gap between machine learning and neuroscience.

Introduction

Again, the authors state that there has been a disconnect between neuroscience and computer science. They go on to argue that three developments in machine learning, 1) optimization of cost functions, 2) the development of more complex cost functions, and 3) the development of more complex network architectures, can now help bridge that gap, and they introduce three corresponding hypotheses.

1.1 Hypothesis 1 — The brain optimizes cost functions.

One simple example of optimization in humans is movement: the authors argue that we have evolved to move in ways that minimize energy expenditure while still maximizing the gains from the movement. The same mechanism can be applied to machines.
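To make "optimizing a cost function" concrete, here is a toy sketch (my own illustration, not from the paper): a mover picks a speed that trades off energy spent against the reward for arriving quickly, and plain gradient descent finds the optimum. The specific cost and weights are assumptions chosen for illustration.

```python
import numpy as np

# Hypothetical movement cost: energy grows quadratically with speed v,
# while reward grows linearly, so there is a single best trade-off.
def cost(v, energy_weight=1.0, reward_weight=4.0):
    return energy_weight * v**2 - reward_weight * v

def grad(v, energy_weight=1.0, reward_weight=4.0):
    # Analytic derivative of cost with respect to v
    return 2 * energy_weight * v - reward_weight

v = 0.0  # start from rest
for _ in range(100):
    v -= 0.1 * grad(v)  # gradient descent step

print(round(v, 3))  # converges to the analytic optimum v = 2.0
```

The point of the hypothesis is not that the brain literally runs gradient descent, only that behavior looks as if some cost like this is being minimized.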

1.2 Hypothesis 2 — Cost functions are diverse across areas and change over development.

The main idea here is that the cost function does not have to be global. Our brain might use mean squared error for optimizing energy consumption while moving, but the same brain might use a cost function similar to softmax cross-entropy for classification. This mix of specialized cost functions can make learning more efficient.
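The two cost functions named above can be written out directly; this is a minimal sketch with toy data of my own choosing, just to show how different the two objectives are.

```python
import numpy as np

# Mean squared error: natural for continuous targets (e.g. movement).
def mse(pred, target):
    return np.mean((pred - target) ** 2)

# Softmax cross-entropy: natural for discrete classification.
def softmax_cross_entropy(logits, label):
    logits = logits - logits.max()                    # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum()) # log-softmax
    return -log_probs[label]

print(round(mse(np.array([0.9, 2.1]), np.array([1.0, 2.0])), 6))  # 0.01
print(round(softmax_cross_entropy(np.array([2.0, 0.5, 0.1]), 0), 3))
```

One brain region minimizing the first and another minimizing the second is exactly the "diverse cost functions" picture of Hypothesis 2.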

1.3 Hypothesis 3 — Specialized systems allow efficiently solving key computational problems.

This part is quite long, but in essence:
1. Different parts of the brain do different things (e.g. the thalamus may act as an information-routing gate, while areas like the basal ganglia help make discrete decisions); hence the development of different machine learning architectures (LSTMs, RNNs, CNNs, etc.) is critical for efficiency.

2. However, the human brain is very special: even though the world gives us only a limited amount of labeled data, we are able to learn the differences between things and correctly tell them apart. No existing unsupervised learning algorithm can do what we humans do (as of today).

3. Hence, it is very important to find the right sequence of cost functions on which an unsupervised model can be trained to produce the right behavior.
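Point 3 can be sketched as a simple curriculum loop: one shared set of parameters is shaped by a sequence of cost functions, each defining its own gradient. Everything here is illustrative and my own assumption — the paper does not specify these particular phases or losses.

```python
import numpy as np

rng = np.random.default_rng(0)
w = rng.normal(size=3)                      # one shared parameter vector
x = rng.normal(size=(50, 3))
y = x @ np.array([1.0, -2.0, 0.5])          # toy regression targets

# Phase A cost: fit the data (mean squared error).
def mse_loss_grad(w):
    err = x @ w - y
    return (err**2).mean(), 2 * x.T @ err / len(y)

# Phase B cost: an L1 penalty encouraging sparse weights (hypothetical
# second objective, standing in for "a different cost function").
def sparsity_loss_grad(w):
    return np.abs(w).sum(), np.sign(w)

# The "sequence of cost functions": each phase hands off its parameters.
curriculum = [(mse_loss_grad, 500), (sparsity_loss_grad, 5)]
for loss_grad, steps in curriculum:
    for _ in range(steps):
        _, g = loss_grad(w)
        w -= 0.01 * g

print(np.round(w, 2))
```

The design point is that ordering matters: running the sparsity phase first would shrink random weights, while running it after fitting merely regularizes an already-good solution.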

2. The brain can optimize cost functions

The authors claim that we can view the brain as an optimization machine with a variety of cost functions for different tasks, each of which it optimizes during learning. Mechanisms that could implement this optimization include back-propagation-like credit assignment, spike-timing-dependent plasticity, dendritic computation, and local excitatory-inhibitory networks, which together may yield a learning procedure more powerful than any current optimization algorithm.
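Of the mechanisms listed, spike-timing-dependent plasticity has a particularly compact textbook form, sketched below with parameter values that are my own illustrative assumptions (amplitudes and time constant are not from the paper).

```python
import numpy as np

# Standard STDP window: if the presynaptic spike precedes the
# postsynaptic spike (dt > 0), the synapse strengthens; if it
# follows (dt < 0), the synapse weakens, decaying exponentially
# with the timing difference in both cases.
def stdp_dw(dt_ms, a_plus=0.1, a_minus=0.12, tau_ms=20.0):
    if dt_ms > 0:
        return a_plus * np.exp(-dt_ms / tau_ms)
    return -a_minus * np.exp(dt_ms / tau_ms)

print(round(stdp_dw(5.0), 4))   # pre-before-post: potentiation (> 0)
print(round(stdp_dw(-5.0), 4))  # post-before-pre: depression (< 0)
```

A local rule like this looks nothing like back-propagation, yet the paper's argument is that such rules may still be implementing the optimization of some cost function.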

Final Words

For now, I’ll stop here, since I have few assignments to due tmrw, but will come back to to this paper very soon. Sorry for the short posting.

Reference

  1. Marblestone, A. H., Wayne, G., & Kording, K. P. (2016). Toward an integration of deep learning and neuroscience. Frontiers in Computational Neuroscience, 10, 94.
  2. Basal ganglia. (2018). En.wikipedia.org. Retrieved 3 April 2018, from https://en.wikipedia.org/wiki/Basal_ganglia
  3. Mandal, A. (2010). What is the Thalamus? News-Medical.net. Retrieved 3 April 2018, from https://www.news-medical.net/health/What-is-the-Thalamus.aspx
  4. Spike-timing-dependent plasticity. (2018). En.wikipedia.org. Retrieved 3 April 2018, from https://en.wikipedia.org/wiki/Spike-timing-dependent_plasticity
  5. (2018). Neurophysics.ucsd.edu. Retrieved 3 April 2018, from https://neurophysics.ucsd.edu/courses/physics_171/annurev.neuro.28.061604.135703.pdf
  6. (2018). Ocw.mit.edu. Retrieved 3 April 2018, from https://ocw.mit.edu/courses/brain-and-cognitive-sciences/9-641j-introduction-to-neural-networks-spring-2005/lecture-notes/lec14_excinh.pdf

