Quantization-Free Autoregressive Action Transformer

18 March 2025
Ziyad Sheebaelhamd, Michael Tschannen, Michael Muehlebach, Claire Vernade
Abstract

Current transformer-based imitation learning approaches introduce discrete action representations and train an autoregressive transformer decoder on the resulting latent codes. However, the initial quantization breaks the continuous structure of the action space, thereby limiting the capabilities of the generative model. Instead, we propose a quantization-free method that leverages Generative Infinite-Vocabulary Transformers (GIVT) as a direct, continuous policy parametrization for autoregressive transformers. This simplifies the imitation learning pipeline while achieving state-of-the-art performance on a variety of popular simulated robotics tasks. We further improve policy roll-outs by carefully studying sampling algorithms.
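The abstract does not spell out the architecture, but the core idea, replacing a softmax over a quantized action codebook with a head that directly parametrizes a continuous distribution over the next action, can be sketched as follows. This is a minimal illustration assuming a PyTorch GIVT-style Gaussian-mixture output head; the class name, dimensions, and mixture size are hypothetical choices for this sketch, not the paper's exact design.

import torch
import torch.nn as nn

class ContinuousActionHead(nn.Module):
    # Hypothetical GIVT-style output head: instead of a softmax over a
    # quantized codebook, the transformer's hidden state parametrizes a
    # Gaussian mixture over the next continuous action.
    def __init__(self, hidden_dim=256, action_dim=7, num_components=8):
        super().__init__()
        self.k = num_components
        self.d = action_dim
        # One projection yields mixture logits, means, and log-stds.
        self.proj = nn.Linear(hidden_dim, self.k * (1 + 2 * self.d))

    def forward(self, h):
        # h: (batch, hidden_dim) hidden state for the current decoding step.
        params = self.proj(h)
        logits, rest = params.split([self.k, self.k * 2 * self.d], dim=-1)
        means, log_stds = rest.view(-1, self.k, 2 * self.d).chunk(2, dim=-1)
        mixture = torch.distributions.Categorical(logits=logits)
        components = torch.distributions.Independent(
            torch.distributions.Normal(means, log_stds.exp()), 1)
        # A continuous distribution over actions: no quantization step,
        # and sampling and log-likelihoods remain exact.
        return torch.distributions.MixtureSameFamily(mixture, components)

# Sampling a continuous action during a roll-out:
head = ContinuousActionHead()
dist = head(torch.randn(1, 256))
action = dist.sample()  # shape (1, 7): a continuous action vector

Because the head outputs a proper continuous distribution, training can maximize the exact log-likelihood of demonstrated actions, and roll-out behavior can be tuned through the sampling procedure alone.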

View on arXiv
@article{sheebaelhamd2025_2503.14259,
  title={Quantization-Free Autoregressive Action Transformer},
  author={Ziyad Sheebaelhamd and Michael Tschannen and Michael Muehlebach and Claire Vernade},
  journal={arXiv preprint arXiv:2503.14259},
  year={2025}
}