Latent Space Energy-based Neural ODEs

Abstract

This paper introduces novel deep dynamical models designed to represent continuous-time sequences. Our approach employs a neural emission model to generate each data point in the time series through a non-linear transformation of a latent state vector. The evolution of these latent states is implicitly defined by a neural ordinary differential equation (ODE), with the initial state drawn from an informative prior distribution parameterized by an energy-based model (EBM). This framework is extended to disentangle dynamic states from underlying static factors of variation, represented as time-invariant variables in the latent space. We train the model end-to-end by maximum likelihood estimation with Markov chain Monte Carlo (MCMC). Experimental results on oscillating systems, videos, and real-world state sequences (MuJoCo) demonstrate that our model with the learnable energy-based prior outperforms existing counterparts and generalizes to new dynamic parameterizations, enabling long-horizon predictions.
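
The following is a minimal sketch of the kind of model the abstract describes, written in PyTorch under simplifying assumptions: a fixed-step Euler solver stands in for a full ODE solver, and short-run Langevin dynamics is used for the MCMC sampling. All names here (ODEFunc, EnergyPrior, Decoder, train_step, and the toy shapes) are hypothetical illustrations, not the authors' released code.

# A minimal sketch, assuming a PyTorch-style implementation with a fixed-step
# Euler solver and short-run Langevin MCMC; all names are hypothetical.
import torch
import torch.nn as nn


class ODEFunc(nn.Module):
    """Latent dynamics dz/dt = f_theta(z), defined implicitly by a neural net."""
    def __init__(self, z_dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(z_dim, hidden), nn.Tanh(),
                                 nn.Linear(hidden, z_dim))

    def forward(self, z):
        return self.net(z)


class EnergyPrior(nn.Module):
    """EBM prior on the initial state: p(z0) proportional to exp(-E(z0)) N(z0; 0, I)."""
    def __init__(self, z_dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(z_dim, hidden), nn.GELU(),
                                 nn.Linear(hidden, 1))

    def forward(self, z):
        return self.net(z).squeeze(-1)


class Decoder(nn.Module):
    """Emission model: non-linear map from latent state z_t to observation x_t."""
    def __init__(self, z_dim, x_dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(z_dim, hidden), nn.Tanh(),
                                 nn.Linear(hidden, x_dim))

    def forward(self, z):
        return self.net(z)


def integrate(ode_func, z0, n_steps, dt):
    """Fixed-step Euler rollout of the latent ODE; returns (batch, time, z_dim)."""
    zs, z = [z0], z0
    for _ in range(n_steps - 1):
        z = z + dt * ode_func(z)
        zs.append(z)
    return torch.stack(zs, dim=1)


def langevin(energy_fn, z, n_steps=20, step=0.1):
    """Short-run Langevin MCMC targeting exp(-energy_fn(z))."""
    z = z.clone().detach().requires_grad_(True)
    for _ in range(n_steps):
        grad = torch.autograd.grad(energy_fn(z).sum(), z)[0]
        z = (z - 0.5 * step ** 2 * grad
             + step * torch.randn_like(z)).detach().requires_grad_(True)
    return z.detach()


def train_step(x, ode_func, prior, decoder, opt, z_dim, dt=0.1, sigma=0.1):
    """One maximum-likelihood step using prior (negative) and posterior (positive) samples."""
    B, T, _ = x.shape
    gauss = lambda z: 0.5 * (z ** 2).sum(-1)          # reference N(0, I) energy

    # Negative samples from the EBM prior.
    z_neg = langevin(lambda z: prior(z) + gauss(z), torch.randn(B, z_dim))

    # Positive samples from the posterior p(z0 | x): add the likelihood term.
    def post_energy(z0):
        recon = decoder(integrate(ode_func, z0, T, dt))
        return (prior(z0) + gauss(z0)
                + ((x - recon) ** 2).sum(dim=(1, 2)) / (2 * sigma ** 2))
    z_pos = langevin(post_energy, torch.randn(B, z_dim))

    # Update decoder/ODE with the reconstruction term, EBM with the contrastive term.
    opt.zero_grad()
    recon = decoder(integrate(ode_func, z_pos, T, dt))
    recon_loss = ((x - recon) ** 2).sum() / (2 * sigma ** 2 * B)
    ebm_loss = (prior(z_pos) - prior(z_neg)).mean()
    (recon_loss + ebm_loss).backward()
    opt.step()
    return recon_loss.item(), ebm_loss.item()


# Usage with toy shapes: 8 sequences, 25 time steps, 3-dimensional observations.
z_dim, x_dim = 4, 3
ode_func, prior, decoder = ODEFunc(z_dim), EnergyPrior(z_dim), Decoder(z_dim, x_dim)
opt = torch.optim.Adam(list(ode_func.parameters()) + list(prior.parameters())
                       + list(decoder.parameters()), lr=1e-3)
x = torch.randn(8, 25, x_dim)
print(train_step(x, ode_func, prior, decoder, opt, z_dim))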

@article{cheng2025_2409.03845,
  title={Latent Space Energy-based Neural ODEs},
  author={Sheng Cheng and Deqian Kong and Jianwen Xie and Kookjin Lee and Ying Nian Wu and Yezhou Yang},
  journal={arXiv preprint arXiv:2409.03845},
  year={2025}
}