Liquid Time-constant Networks

By Ramin Hasani et al.

Table of Contents

Abstract
1 Introduction
2 LTCs forward-pass by a fused ODE solver
3 Training LTC networks by BPTT
4 Bounds on τ and neural state of LTCs
5 On the expressive power of LTCs
5.1 Measuring expressivity by trajectory length

Summary

We introduce a new class of time-continuous recurrent neural network models called Liquid Time-Constant Networks (LTCs). These networks represent dynamical systems with varying (liquid) time constants coupled to their hidden state. They exhibit stable and bounded behavior and superior expressivity within the family of neural ordinary differential equations. The paper develops the theoretical foundations, computational aspects, and training methods for LTCs, and presents experimental results comparing them to classical and modern RNNs on time-series prediction tasks. The study establishes the universal approximation capability of LTCs and demonstrates their performance improvements over these baselines. It also derives bounds on the time constant and neural state of LTCs. Finally, the expressivity of LTCs is analyzed against other time-continuous models, using trajectory length as a measure of the complexity a network produces when processing spatiotemporal data.
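For reference, the LTC dynamics summarized above take the following form in the paper, where $f$ is a neural network with parameters $\theta$, $\tau$ is a fixed time-constant vector, $A$ is a bias vector, and $\odot$ denotes elementwise multiplication over the hidden state $\mathbf{x}(t)$ and input $\mathbf{I}(t)$:

$$\frac{d\mathbf{x}(t)}{dt} = -\left[\frac{1}{\tau} + f\big(\mathbf{x}(t), \mathbf{I}(t), t, \theta\big)\right] \odot \mathbf{x}(t) + f\big(\mathbf{x}(t), \mathbf{I}(t), t, \theta\big) \odot A$$

Because $f$ also appears in the decay term, the effective (liquid) time constant varies with the state and input. The fused ODE solver of Section 2 advances this system by treating the nonlinear drive explicitly and the state decay implicitly in a single update:

$$\mathbf{x}(t + \Delta t) = \frac{\mathbf{x}(t) + \Delta t\, f\big(\mathbf{x}(t), \mathbf{I}(t), t, \theta\big) \odot A}{1 + \Delta t \left[\frac{1}{\tau} + f\big(\mathbf{x}(t), \mathbf{I}(t), t, \theta\big)\right]}$$

The sketch below is a minimal NumPy rendering of this fused update rule; the single sigmoid layer used for $f$ and the toy sinusoidal input are illustrative assumptions, not the architecture or data from the paper.

```python
import numpy as np

def fused_ltc_step(x, I, delta_t, tau, A, f):
    """One fused explicit-implicit Euler step of an LTC cell.

    x       : hidden state, shape (n,)
    I       : input at the current step, shape (m,)
    delta_t : solver step size
    tau     : per-neuron time-constant vector, shape (n,)
    A       : per-neuron bias vector, shape (n,)
    f       : callable f(x, I) -> nonnegative activations, shape (n,)
    """
    fx = f(x, I)
    # Explicit numerator (drive toward A), implicit denominator (decay):
    # the implicit decay term keeps the update stable.
    return (x + delta_t * fx * A) / (1.0 + delta_t * (1.0 / tau + fx))

# Illustrative f (an assumption): sigmoid of a linear map of [x; I].
rng = np.random.default_rng(0)
n, m = 8, 3
W = rng.normal(scale=0.5, size=(n, n + m))
b = rng.normal(size=n)
f = lambda x, I: 1.0 / (1.0 + np.exp(-(W @ np.concatenate([x, I]) + b)))

# Unroll the cell over a toy sinusoidal input sequence.
x = np.zeros(n)
for t in range(100):
    I_t = np.sin(0.1 * t) * np.ones(m)
    x = fused_ltc_step(x, I_t, delta_t=0.1, tau=np.ones(n), A=np.ones(n), f=f)
print(x)
```

The paper trains LTCs with vanilla backpropagation through time over exactly these unrolled solver steps (Section 3), rather than with adjoint-based reverse-mode integration.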