EVENT TALKS

Latent Stochastic Differential Equations

David Duvenaud | TMLS2019

TDS Editors
Towards Data Science
2 min read · Apr 14, 2020


A talk from the Toronto Machine Learning Summit: https://torontomachinelearning.com/

About the speaker:

David Duvenaud is an assistant professor in computer science and statistics at the University of Toronto. He holds a Canada Research Chair in generative models. His postdoctoral research was done at Harvard University, where he worked on hyperparameter optimization, variational inference, and chemical design. He did his Ph.D. at the University of Cambridge, studying Bayesian nonparametrics with Zoubin Ghahramani and Carl Rasmussen. David spent two summers in the machine vision team at Google Research, and also co-founded Invenia, an energy forecasting and trading company. David is a founding member of the Vector Institute and a Faculty Fellow at ElementAI.

About the talk:

Much real-world data is sampled at irregular intervals, but most time series models require regularly sampled data. Continuous-time latent variable models can address this problem, but until now only deterministic models, such as latent ODEs, were efficiently trainable by backprop. We generalize the adjoint sensitivity method to SDEs, constructing an SDE that runs backwards in time and computes all necessary gradients, along with a general algorithm that allows SDEs to be trained by backpropagation with constant memory cost. We also give an efficient algorithm for gradient-based stochastic variational inference in function space, all with the use of adaptive black-box SDE solvers. Finally, we’ll show initial results of applying latent SDEs to time series data, and discuss prototypes of infinitely deep Bayesian neural networks.
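For readers who want to get a feel for the idea, below is a minimal sketch of simulating a neural SDE and backpropagating through it with a constant-memory adjoint, using the open-source torchsde library that accompanies this line of work. The drift and diffusion networks, the batch of initial states, the irregular time grid, and the solver settings are illustrative assumptions rather than the exact setup shown in the talk.

import torch
import torchsde


class ToySDE(torch.nn.Module):
    # dy = f(t, y) dt + g(t, y) dW, with learned drift and diffusion.
    # The stochastic adjoint in the talk is derived for Stratonovich SDEs,
    # so we use that convention here (an illustrative choice).
    noise_type = "diagonal"   # one independent Brownian motion per state dimension
    sde_type = "stratonovich"

    def __init__(self, dim=4):
        super().__init__()
        self.drift_net = torch.nn.Sequential(
            torch.nn.Linear(dim, 64), torch.nn.Tanh(), torch.nn.Linear(64, dim)
        )
        self.diffusion_net = torch.nn.Sequential(
            torch.nn.Linear(dim, 64), torch.nn.Tanh(), torch.nn.Linear(64, dim)
        )

    def f(self, t, y):
        # Drift term.
        return self.drift_net(y)

    def g(self, t, y):
        # Diagonal diffusion term, kept positive.
        return torch.nn.functional.softplus(self.diffusion_net(y))


sde = ToySDE(dim=4)
y0 = torch.zeros(32, 4)                      # batch of 32 initial latent states
ts = torch.tensor([0.0, 0.3, 0.7, 1.5])      # irregularly spaced observation times

# sdeint_adjoint computes gradients by solving a second SDE backwards in time,
# so memory cost stays constant in the number of solver steps.
ys = torchsde.sdeint_adjoint(sde, y0, ts, method="midpoint", dt=0.05)

loss = ys.pow(2).mean()                      # placeholder loss on the latent path
loss.backward()                              # gradients w.r.t. drift/diffusion parameters

In a full latent SDE model, the initial state y0 would come from an encoder over the observed time series, the loss would combine a reconstruction term at the observation times with a KL term between the learned posterior SDE and a prior SDE, and an adaptive solver could replace the fixed step size used above.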


