AttFC: Attention Fully-Connected Layer for Large-Scale Face Recognition with One GPU

10 March 2025
Zhuowen Zheng
Yain-Whar Si
Xiaochen Yuan
Junwei Duan
Ke Wang
Xiaofan Li
Xinyuan Zhang
Xueyuan Gong
Abstract

With the advancement of deep neural networks (DNNs) and the availability of large-scale datasets, face recognition (FR) models have achieved exceptional performance. However, the parameter count of the fully-connected (FC) layer scales directly with the number of identities in the dataset, so training an FR model on a large-scale dataset yields an excessively large model and a substantial demand for computational resources such as time and memory. This paper proposes the attention fully-connected (AttFC) layer, which can significantly reduce these computational requirements. AttFC employs an attention loader to generate a generative class center (GCC) and dynamically stores class centers in a Dynamic Class Container (DCC). The DCC holds only a small subset of all class centers, so its parameter count is substantially smaller than that of a full FC layer. Moreover, training face recognition models on large-scale datasets with one GPU often runs into out-of-memory (OOM) issues; AttFC avoids this problem while achieving performance comparable to state-of-the-art methods.
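To make the idea concrete, below is a minimal PyTorch-style sketch of how such a layer might work. This is an illustration under stated assumptions, not the authors' implementation: the class name AttFCSketch, the FIFO slot-replacement policy, and the single-head attention loader are all hypothetical. Only the high-level design comes from the abstract: a small Dynamic Class Container (DCC) of class centers replaces the full FC weight matrix, and an attention module produces generative class centers (GCCs) from batch features.

# Hypothetical sketch (not the authors' code): an AttFC-style layer that keeps
# only a small DCC of class centers instead of a full num_identities x dim
# FC weight matrix, so memory stays bounded regardless of identity count.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttFCSketch(nn.Module):
    def __init__(self, feat_dim: int, dcc_size: int):
        super().__init__()
        # DCC: stores only dcc_size class centers, far fewer than the
        # total number of identities in a large-scale dataset.
        self.register_buffer("dcc", torch.zeros(dcc_size, feat_dim))
        # Which identity occupies each slot (-1 = empty); a full version
        # would use this to map batch labels onto container slots.
        self.register_buffer("dcc_ids", torch.full((dcc_size,), -1, dtype=torch.long))
        # "Attention loader" stand-in: one attention layer that turns batch
        # features into generative class centers (GCCs).
        self.attn = nn.MultiheadAttention(feat_dim, num_heads=1, batch_first=True)
        self.ptr = 0  # next slot to overwrite (assumed FIFO policy)

    @torch.no_grad()
    def update_dcc(self, centers: torch.Tensor, labels: torch.Tensor) -> None:
        # Dynamically store new centers by overwriting the oldest slots.
        n = centers.size(0)
        idx = (self.ptr + torch.arange(n, device=centers.device)) % self.dcc.size(0)
        self.dcc[idx] = F.normalize(centers, dim=1)
        self.dcc_ids[idx] = labels
        self.ptr = int((self.ptr + n) % self.dcc.size(0))

    def forward(self, feats: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
        # Generate a class center per sample via attention over the container.
        q = feats.unsqueeze(1)                                   # (B, 1, D)
        kv = self.dcc.unsqueeze(0).expand(feats.size(0), -1, -1) # (B, S, D)
        gcc, _ = self.attn(q, kv, kv)
        self.update_dcc(gcc.squeeze(1).detach(), labels)
        # Logits against the small container only, not all identities.
        return F.normalize(feats, dim=1) @ F.normalize(self.dcc, dim=1).t()

# Example usage (hypothetical shapes):
# layer = AttFCSketch(feat_dim=512, dcc_size=4096)
# logits = layer(backbone(images), labels)  # (B, 4096), not (B, num_identities)

In a complete implementation, the stored dcc_ids would be used to locate each sample's target slot so that a margin-based softmax loss can be computed over the container's logits; the sketch above only shows how the container bounds the FC layer's parameter count.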

View on arXiv
@article{zheng2025_2503.06839,
  title={AttFC: Attention Fully-Connected Layer for Large-Scale Face Recognition with One GPU},
  author={Zhuowen Zheng and Yain-Whar Si and Xiaochen Yuan and Junwei Duan and Ke Wang and Xiaofan Li and Xinyuan Zhang and Xueyuan Gong},
  journal={arXiv preprint arXiv:2503.06839},
  year={2025}
}