Low Rank and Sparse Fourier Structure in Recurrent Networks Trained on Modular Addition

28 March 2025
Akshay Rangamani
Abstract

Modular addition tasks serve as a useful test bed for observing empirical phenomena in deep learning, including the phenomenon of "grokking". Prior work has shown that one-layer transformer architectures learn Fourier Multiplication circuits to solve modular addition tasks. In this paper, we show that Recurrent Neural Networks (RNNs) trained on modular addition tasks also use a Fourier Multiplication strategy. We identify low-rank structures in the model weights and attribute model components to specific Fourier frequencies, resulting in a sparse representation in Fourier space. We also show empirically that the RNN is robust to the removal of individual frequencies, while performance degrades drastically as more frequencies are ablated from the model.
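
The sketch below is a minimal reconstruction of the kind of experiment the abstract describes, not the authors' code: train a small RNN on modular addition, FFT the learned embeddings to check for a sparse frequency spectrum, and ablate frequencies to test robustness. The Fourier Multiplication strategy rests on the identity cos(w(a+b)) = cos(wa)cos(wb) - sin(wa)sin(wb), so a network that embeds each token on a few circular frequencies can compute a + b (mod p) from products of those features. The modulus p = 59, the architecture, the optimizer settings, and the choice to ablate embedding frequencies (rather than other model components) are all illustrative assumptions, not details from the paper.

```python
import torch
import torch.nn as nn

p = 59                                  # assumed modulus; the paper's value may differ
torch.manual_seed(0)

# Full dataset: every pair (a, b) with label (a + b) mod p.
a, b = torch.meshgrid(torch.arange(p), torch.arange(p), indexing="ij")
x = torch.stack([a.flatten(), b.flatten()], dim=1)   # (p*p, 2) token sequences
y = (x[:, 0] + x[:, 1]) % p

class AdditionRNN(nn.Module):
    def __init__(self, p, d=128):
        super().__init__()
        self.embed = nn.Embedding(p, d)
        self.rnn = nn.RNN(d, d, batch_first=True)
        self.readout = nn.Linear(d, p)

    def forward(self, tokens):
        h, _ = self.rnn(self.embed(tokens))  # run over the 2-token sequence
        return self.readout(h[:, -1])        # predict from the final hidden state

model = AdditionRNN(p)
opt = torch.optim.AdamW(model.parameters(), lr=1e-3, weight_decay=1.0)
loss_fn = nn.CrossEntropyLoss()

for step in range(5000):                # illustrative budget; grokking can need more
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()

# Sparsity check: FFT the embedding matrix along the token axis. A Fourier
# Multiplication solution should concentrate power on a few frequencies.
with torch.no_grad():
    spectrum = torch.fft.rfft(model.embed.weight, dim=0)     # (p//2 + 1, d)
    power = spectrum.abs().pow(2).sum(dim=1)                 # power per frequency
    top = (torch.topk(power[1:], k=5).indices + 1).tolist()  # skip the DC term
    print("dominant frequencies:", top)

# Ablation sketch: zero chosen frequencies in the embedding spectrum and
# re-measure accuracy. The abstract reports robustness to removing one
# frequency and sharp degradation as more are removed.
def accuracy_with_ablated(freqs):
    with torch.no_grad():
        orig = model.embed.weight.clone()
        spec = torch.fft.rfft(orig, dim=0)
        spec[list(freqs)] = 0                                # kill those frequency rows
        model.embed.weight.copy_(torch.fft.irfft(spec, n=p, dim=0))
        acc = (model(x).argmax(dim=1) == y).float().mean().item()
        model.embed.weight.copy_(orig)                       # restore the weights
    return acc

print("ablate 1 frequency:", accuracy_with_ablated(top[:1]))
print("ablate 5 frequencies:", accuracy_with_ablated(top))
```

Whether single-frequency robustness shows up at this toy scale depends on the assumed hyperparameters; the paper attributes specific model components, not only embeddings, to individual frequencies.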

@article{rangamani2025_2503.22059,
  title={Low Rank and Sparse Fourier Structure in Recurrent Networks Trained on Modular Addition},
  author={Akshay Rangamani},
  journal={arXiv preprint arXiv:2503.22059},
  year={2025}
}