Emergent Specialization: Rare Token Neurons in Language Models

19 May 2025
Jing Liu, Haozheng Wang, Yueheng Li
Communities: MILM, LRM
Abstract

Large language models struggle to represent and generate rare tokens despite their importance in specialized domains. In this study, we identify neuron structures with an exceptionally strong influence on a language model's prediction of rare tokens, which we term rare token neurons, and investigate the mechanisms behind their emergence and behavior. These neurons exhibit a characteristic three-phase organization (plateau, power-law, and rapid decay) that emerges dynamically during training, evolving from a homogeneous initial state into a functionally differentiated architecture. In the activation space, rare token neurons form a coordinated subnetwork that selectively co-activates while avoiding co-activation with other neurons. This functional specialization potentially correlates with the development of heavy-tailed weight distributions, suggesting a statistical-mechanical basis for emergent specialization.
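As a rough illustration of what "influence on rare-token prediction" could mean mechanistically, the sketch below zero-ablates individual hidden neurons in a toy feed-forward layer and measures the change in a designated rare token's logit, then sorts the resulting influence scores. The layer sizes, random weights, the zero-ablation criterion, and the rare_token_id are all assumptions made for illustration; this is not the paper's actual measurement procedure.

# Hypothetical sketch: rank hidden neurons by how much zero-ablating each one
# changes the logit of a designated "rare" token in a tiny feed-forward layer.
# Shapes, weights, and the ablation criterion are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)

d_model, d_hidden, vocab = 64, 256, 1000
rare_token_id = 987  # assumed index of a rare token

# Synthetic MLP block: W_in projects into the hidden layer, W_out maps back to vocab logits.
W_in = rng.normal(0, 0.05, size=(d_model, d_hidden))
W_out = rng.normal(0, 0.05, size=(d_hidden, vocab))
x = rng.normal(size=(d_model,))          # a single residual-stream input

h = np.maximum(W_in.T @ x, 0.0)          # ReLU hidden activations
base_logit = (h @ W_out)[rare_token_id]  # rare-token logit with all neurons active

# Influence of neuron i = drop in the rare-token logit when neuron i is zeroed out.
influence = np.array([
    base_logit - ((h * (np.arange(d_hidden) != i)) @ W_out)[rare_token_id]
    for i in range(d_hidden)
])

# Sorting by influence yields a curve whose shape (plateau, power-law, rapid decay)
# is the kind of three-phase organization the abstract describes; here it is only
# noise, since the weights are random.
ranked = np.sort(np.abs(influence))[::-1]
print("Top-5 neuron influences on the rare token:", ranked[:5])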

@article{liu2025_2505.12822,
  title={Emergent Specialization: Rare Token Neurons in Language Models},
  author={Jing Liu and Haozheng Wang and Yueheng Li},
  journal={arXiv preprint arXiv:2505.12822},
  year={2025}
}