ResearchTrend.AI

Spline-based Transformers

3 April 2025
Prashanth Chandran
Agon Serifi
Markus Gross
Moritz Bächer
Abstract

We introduce Spline-based Transformers, a novel class of Transformer models that eliminate the need for positional encoding. Inspired by workflows using splines in computer animation, our Spline-based Transformers embed an input sequence of elements as a smooth trajectory in latent space. Overcoming drawbacks of positional encoding such as sequence length extrapolation, Spline-based Transformers also provide a novel way for users to interact with transformer latent spaces by directly manipulating the latent control points to create new latent trajectories and sequences. We demonstrate the superior performance of our approach in comparison to conventional positional encoding on a variety of datasets, ranging from synthetic 2D to large-scale real-world datasets of images, 3D shapes, and animations.
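The abstract does not specify which spline family the authors use, but the core idea, representing a token sequence as a smooth latent trajectory defined by a handful of control points instead of adding positional encodings, can be illustrated with a minimal sketch. The cubic Bézier curve, the `bezier_trajectory` function, and all shapes below are hypothetical choices for illustration, not the paper's actual formulation:

```python
import numpy as np

def bezier_trajectory(control_points: np.ndarray, num_tokens: int) -> np.ndarray:
    """Evaluate a cubic Bezier curve at evenly spaced parameter values.

    control_points: (4, d) array of latent control points (illustrative shape).
    Returns a (num_tokens, d) latent trajectory: one latent per sequence element.
    """
    p0, p1, p2, p3 = control_points
    # One parameter value per token; works for any sequence length,
    # which is how a spline parameterization sidesteps length extrapolation.
    t = np.linspace(0.0, 1.0, num_tokens)[:, None]  # shape (num_tokens, 1)
    # Bernstein basis of a cubic Bezier curve.
    return ((1 - t) ** 3 * p0
            + 3 * (1 - t) ** 2 * t * p1
            + 3 * (1 - t) * t ** 2 * p2
            + t ** 3 * p3)

# Example: 4 control points in an 8-dim latent space, a 16-token sequence.
rng = np.random.default_rng(0)
cps = rng.standard_normal((4, 8))
traj = bezier_trajectory(cps, num_tokens=16)
assert traj.shape == (16, 8)
# The trajectory starts and ends at the outer control points, so editing a
# control point smoothly deforms the whole latent path.
assert np.allclose(traj[0], cps[0]) and np.allclose(traj[-1], cps[3])
```

Under this reading, the user-interaction claim in the abstract corresponds to moving the rows of `cps`: each edit yields a new smooth trajectory, and hence a new decoded sequence.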

@article{chandran2025_2504.02797,
  title={Spline-based Transformers},
  author={Prashanth Chandran and Agon Serifi and Markus Gross and Moritz Bächer},
  journal={arXiv preprint arXiv:2504.02797},
  year={2025}
}