One Transformer for All Time Series: Representing and Training with Time-Dependent Heterogeneous Tabular Data

13 February 2023
Simone Luetto
Fabrizio Garuti
E. Sangineto
L. Forni
Rita Cucchiara
    LMTD
    AI4TS
Abstract

There is a recent growing interest in applying Deep Learning techniques to tabular data, in order to replicate the success of other Artificial Intelligence areas in this structured domain. Particularly interesting is the case in which tabular data have a time dependence, such as, for instance, financial transactions. However, the heterogeneity of tabular values, in which categorical elements are mixed with numerical items, makes this adaptation difficult. In this paper, we propose a Transformer architecture to represent heterogeneous time-dependent tabular data, in which numerical features are represented using a set of frequency functions and the whole network is uniformly trained with a unique loss function.
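
The sketch below is not the authors' code; it is a minimal illustration of the idea the abstract describes: mapping heterogeneous fields of a table row into a shared embedding space, where categorical columns use lookup tables and numerical columns are encoded with a set of sinusoidal frequency functions before being fed to a Transformer. All class names, dimensions, and frequency choices are illustrative assumptions.

```python
import torch
import torch.nn as nn


class FrequencyNumericEncoder(nn.Module):
    """Encode a scalar value with sin/cos at several frequencies, then project to d_model."""

    def __init__(self, d_model: int, n_freqs: int = 8):
        super().__init__()
        # Geometrically spaced frequencies (an assumption, not necessarily the paper's choice).
        self.register_buffer("freqs", 2.0 ** torch.arange(n_freqs).float())
        self.proj = nn.Linear(2 * n_freqs, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1) scalar values -> (batch, 1, d_model) embeddings
        angles = x.unsqueeze(-1) * self.freqs                       # (batch, 1, n_freqs)
        feats = torch.cat([torch.sin(angles), torch.cos(angles)], dim=-1)
        return self.proj(feats)


class HeterogeneousRowEncoder(nn.Module):
    """Embed one table row: categorical and numerical columns become one token each."""

    def __init__(self, cat_cardinalities, n_num_cols, d_model: int = 64):
        super().__init__()
        self.cat_embeds = nn.ModuleList(
            nn.Embedding(card, d_model) for card in cat_cardinalities
        )
        self.num_encoders = nn.ModuleList(
            FrequencyNumericEncoder(d_model) for _ in range(n_num_cols)
        )

    def forward(self, cat_values: torch.Tensor, num_values: torch.Tensor) -> torch.Tensor:
        # cat_values: (batch, n_cat) integer codes; num_values: (batch, n_num) floats
        tokens = [emb(cat_values[:, i]) for i, emb in enumerate(self.cat_embeds)]
        tokens += [
            enc(num_values[:, i:i + 1]).squeeze(1)
            for i, enc in enumerate(self.num_encoders)
        ]
        # (batch, n_cat + n_num, d_model): a token per field, ready for a standard Transformer encoder.
        return torch.stack(tokens, dim=1)
```

With this kind of shared token representation, a single Transformer (e.g. `nn.TransformerEncoder`) can attend jointly over categorical and numerical fields of a time-ordered sequence of rows and be trained end to end with one loss function, which is the uniform-training setting the abstract refers to.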

@article{luetto2023_2302.06375,
  title={One Transformer for All Time Series: Representing and Training with Time-Dependent Heterogeneous Tabular Data},
  author={Simone Luetto and Fabrizio Garuti and Enver Sangineto and Lorenzo Forni and Rita Cucchiara},
  journal={arXiv preprint arXiv:2302.06375},
  year={2023}
}