SecFormer: Towards Fast and Accurate Privacy-Preserving Inference for Large Language Models

1 January 2024
Jinglong Luo, Yehong Zhang, Zhuo Zhang, Jiaqi Zhang, Xin Mu, Hui Wang, Yue Yu, Zenglin Xu
arXiv:2401.00793
Abstract

With the growing use of large language models hosted on cloud platforms to offer inference services, privacy concerns are escalating, especially for sensitive data such as investment plans and bank account details. Secure Multi-Party Computation (SMPC) is a promising way to protect the privacy of both inference data and model parameters. However, applying SMPC to Privacy-Preserving Inference (PPI) for large language models, particularly those based on the Transformer architecture, often leads to considerable slowdowns or declines in performance. This is largely due to the many nonlinear operations in the Transformer architecture, which are poorly suited to SMPC and difficult to circumvent or optimize effectively. To address this, we introduce SecFormer, an optimization framework for fast and accurate PPI on Transformer models. Through model design optimization, we eliminate the high-cost exponential and maximum operations in PPI without sacrificing model performance. In addition, we develop a suite of efficient SMPC protocols that use segmented polynomials, Fourier series, and Goldschmidt's method to handle the remaining complex nonlinear functions in PPI, such as GeLU, LayerNorm, and Softmax. Extensive experiments show that SecFormer outperforms MPCFormer, improving performance by 5.6% and 24.2% for BERT_BASE and BERT_LARGE, respectively. In terms of efficiency, SecFormer is 3.56 and 3.58 times faster than Puma for BERT_BASE and BERT_LARGE, demonstrating its effectiveness and speed.
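To make the Fourier-series idea concrete, below is a minimal plaintext sketch of approximating GeLU with a truncated cosine series. It least-squares-fits the even, bounded-slope residual GeLU(x) - x/2 on a symmetric interval; the interval [-8, 8] and the harmonic count are illustrative assumptions, not the paper's protocol parameters, and SecFormer's actual protocols evaluate such approximations over secret-shared inputs rather than in the clear.

```python
import math
import numpy as np

def gelu(x: np.ndarray) -> np.ndarray:
    """Exact GeLU via the Gaussian error function."""
    erf = np.vectorize(math.erf)
    return 0.5 * x * (1.0 + erf(x / math.sqrt(2.0)))

# Illustrative choices (assumed, not from the paper): fit on [-L, L] with K harmonics.
L, K = 8.0, 12
xs = np.linspace(-L, L, 4001)

# GeLU(x) - x/2 is an even function, so its periodic extension is continuous
# and a short cosine series fits it well; the x/2 term is linear and hence
# already cheap under SMPC.
target = gelu(xs) - 0.5 * xs
basis = np.stack([np.cos(k * math.pi * xs / L) for k in range(K + 1)], axis=1)
coef, *_ = np.linalg.lstsq(basis, target, rcond=None)

approx = 0.5 * xs + basis @ coef
print(f"max |error| on [-{L:g}, {L:g}]: {np.abs(approx - gelu(xs)).max():.4f}")
```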

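Similarly, Goldschmidt's method replaces the true divisions needed by Softmax (and, in a related form, the inverse square root in LayerNorm) with a short sequence of multiplications and subtractions, which map cheaply onto secret shares. The sketch below shows the plaintext iteration for a/b; the linear initial estimate and the fixed iteration count are common textbook choices and are assumptions here, not SecFormer's exact settings.

```python
def goldschmidt_div(a: float, b: float, iterations: int = 4) -> float:
    """Approximate a / b using only multiplication and subtraction.

    Assumes b has been pre-scaled into [0.5, 1), as fixed-point SMPC
    protocols typically arrange before running the iteration.
    """
    f = 2.9142 - 2.0 * b   # linear minimax seed for 1/b on [0.5, 1)
    n, d = a * f, b * f    # invariant: n / d == a / b at every step
    for _ in range(iterations):
        f = 2.0 - d        # correction factor drives d toward 1 ...
        n, d = n * f, d * f  # ... so n converges quadratically to a / b
    return n

# Example: 1 / 7, with b pre-scaled into [0.5, 1) and the result rescaled.
print(goldschmidt_div(1.0, 7.0 / 8.0) / 8.0)  # ~= 0.142857
```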