Closed-Form Continuous-Time Neural Networks

By Ramin Hasani et al
Published on March 2, 2022


Summary

Continuous-time neural processes are powerful sequential decision-makers built from differential equations. The paper introduces an approximate closed-form solution for the interaction between neurons and synapses in liquid time-constant (LTC) networks, a result with broad implications for the design of continuous-time neural models. The proposed closed-form networks train and run inference substantially faster than differential equation-based models, scale well, and perform strongly on time-series modeling tasks. The document also walks through the derivation of the closed-form solution for LTC networks and its practical implementation, and the results indicate that the closed-form solution closely approximates the dynamics of the original ODE system. Finally, a novel neural network model, inspired by the closed-form solution, is designed to address gradient and expressivity issues during optimization.
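As a rough illustration of the idea, the closed-form update can be thought of as a sigmoidal gate in elapsed time that blends two learned targets, replacing a numerical ODE solve with a single algebraic step. The sketch below is a hypothetical minimal version: the heads `f`, `g`, and `h` stand in for the small neural networks of the actual model and are replaced here by fixed random linear maps purely for illustration; none of these names or shapes come from the paper's code.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def make_head(in_dim, out_dim):
    # Stand-in for a learned network head: a fixed random affine map.
    W = rng.normal(scale=0.1, size=(out_dim, in_dim))
    b = np.zeros(out_dim)
    return lambda v: W @ v + b

hidden, inputs = 4, 3
f = make_head(hidden + inputs, hidden)  # controls the time-dependent gate
g = make_head(hidden + inputs, hidden)  # one interpolation target
h = make_head(hidden + inputs, hidden)  # the other interpolation target

def cfc_step(x, I, t):
    """One closed-form state update for elapsed time t (no ODE solver)."""
    v = np.concatenate([x, I])
    gate = sigmoid(-f(v) * t)          # gate moves with elapsed time
    return gate * g(v) + (1.0 - gate) * h(v)

# Irregularly sampled time gaps are handled by passing t directly.
x = np.zeros(hidden)
for t in (0.1, 0.5, 1.0):
    x = cfc_step(x, rng.normal(size=inputs), t)
```

Because the state at time `t` is computed directly rather than integrated step by step, training and inference avoid the solver overhead that the summary attributes to differential equation-based models.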