Neural Ordinary Differential Equations

By Ricky T. Q. Chen et al.

Table of Contents

1 Introduction
2 Reverse-mode automatic differentiation of ODE solutions
3 Replacing residual networks with ODEs for supervised learning
4 Continuous Normalizing Flows
4.1 Experiments with Continuous Normalizing Flows
5 A generative latent function time-series model

Summary

Neural Ordinary Differential Equations is a paper by Ricky T. Q. Chen et al. from the University of Toronto and the Vector Institute. The paper introduces a new family of deep neural network models that parameterize the derivative of the hidden state with a neural network, so the output is computed by a black-box ODE solver rather than a fixed stack of layers. These continuous-depth models have constant memory cost, adapt their evaluation strategy to each input, and can explicitly trade numerical precision for speed. The paper demonstrates these properties in continuous-depth residual networks and continuous-time latent-variable models, introduces continuous normalizing flows, and shows how ODEs can be trained end-to-end within larger models by backpropagating through any ODE solver without access to its internal operations.
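To make the core idea concrete, the sketch below (not from the paper) contrasts a discrete residual update, h_{k+1} = h_k + f(h_k, θ), with the continuous-depth view, dh(t)/dt = f(h(t), t, θ), where the output is obtained by solving the ODE. The dynamics network `f` and its weights are hypothetical toy choices, and a fixed-step fourth-order Runge–Kutta integrator stands in for the adaptive black-box solvers the paper actually uses.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 4  # hidden-state dimension

# Toy dynamics network f(h, t; theta); weights are illustrative only.
W1, b1 = rng.normal(size=(D + 1, 16)), np.zeros(16)
W2, b2 = rng.normal(size=(16, D)), np.zeros(D)

def f(h, t):
    """Neural network parameterizing the derivative of the hidden state."""
    x = np.concatenate([h, [t]])  # condition the dynamics on time t
    return np.tanh(x @ W1 + b1) @ W2 + b2

def resnet_forward(h, n_layers=10):
    """Discrete residual-network view: h_{k+1} = h_k + f(h_k, k)."""
    for k in range(n_layers):
        h = h + f(h, float(k))
    return h

def odeint_rk4(h, t0=0.0, t1=1.0, n_steps=40):
    """Continuous-depth view: solve dh/dt = f(h, t) from t0 to t1.

    A fixed-step RK4 integrator keeps the sketch self-contained; the
    paper relies on adaptive solvers that choose their own step sizes,
    which is what lets the model trade precision for speed.
    """
    dt = (t1 - t0) / n_steps
    t = t0
    for _ in range(n_steps):
        k1 = f(h, t)
        k2 = f(h + 0.5 * dt * k1, t + 0.5 * dt)
        k3 = f(h + 0.5 * dt * k2, t + 0.5 * dt)
        k4 = f(h + dt * k3, t + dt)
        h = h + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
        t += dt
    return h

h0 = rng.normal(size=D)
print("ResNet-style output:", resnet_forward(h0))
print("ODE-solve output:   ", odeint_rk4(h0))
```

In practice one would use an adaptive solver and compute gradients with the adjoint sensitivity method, which solves a second ODE backwards in time instead of storing intermediate activations; this is what gives the models their constant memory cost during training.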