
Privacy-Preserving Logistic Regression Training with A Faster Gradient Variant

26 January 2022
Jonathan Z. Chiang
arXiv: 2201.10838 (abs · PDF · HTML)
Abstract

Logistic regression training over encrypted data has been an attractive approach to addressing security concerns for years. In this paper, we propose a faster gradient variant called quadratic gradient for privacy-preserving logistic regression training. The core of quadratic gradient can be seen as an extension of the simplified fixed Hessian. We enhance Nesterov's accelerated gradient (NAG) and the Adaptive Gradient Algorithm (Adagrad) with quadratic gradient and evaluate the enhanced algorithms on several datasets. Experiments show that the enhanced methods achieve state-of-the-art convergence speed compared to the raw first-order gradient methods. We then adopt the enhanced NAG method to implement homomorphic logistic regression training, obtaining a comparable result in only 3 iterations. There is a promising chance that quadratic gradient could be used to enhance other first-order gradient methods for general numerical optimization problems.
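To make the idea concrete, below is a minimal sketch of a quadratic-gradient-enhanced NAG step for plain (unencrypted) logistic regression, not the paper's implementation. It assumes, following the simplified-fixed-Hessian literature, that the log-likelihood Hessian is bounded by the Böhning-Lindsay matrix (1/4)XᵀX and that the preconditioner keeps only the absolute row sums of that bound as a diagonal; the function name, the eps guard, and the step schedule (1 + lr) are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def quadratic_gradient_nag(X, y, iters=3, eps=1e-8, lr=1.0):
    """NAG for logistic regression where the raw gradient g is replaced
    by the 'quadratic gradient' B_bar * g, with B_bar a fixed diagonal
    preconditioner. A sketch under stated assumptions, not the paper's code."""
    d = X.shape[1]
    # Bohning-Lindsay bound: the Hessian of the log-likelihood is
    # dominated in magnitude by (1/4) X^T X.
    H_bar = 0.25 * X.T @ X
    # Simplified fixed Hessian kept as a diagonal of absolute row sums;
    # eps guards against division by zero (assumed detail).
    B_bar = 1.0 / (eps + np.abs(H_bar).sum(axis=1))
    w = np.zeros(d)
    v = w.copy()
    gamma = 1.0
    for _ in range(iters):
        g = X.T @ (y - sigmoid(X @ v))   # ascent direction of the log-likelihood
        G = B_bar * g                    # the quadratic gradient
        w_new = v + (1.0 + lr) * G       # step size kept above 1, per the paper's idea
        gamma_new = (1.0 + np.sqrt(1.0 + 4.0 * gamma**2)) / 2.0
        v = w_new + ((gamma - 1.0) / gamma_new) * (w_new - w)
        w, gamma = w_new, gamma_new
    return w

# Toy usage: separable 2-D data with a bias column, labels in {0, 1}.
rng = np.random.default_rng(0)
X = np.hstack([np.ones((200, 1)), rng.normal(size=(200, 2))])
y = (X @ np.array([0.5, 2.0, -1.0]) > 0).astype(float)
w = quadratic_gradient_nag(X, y, iters=3)
acc = ((sigmoid(X @ w) > 0.5) == y).mean()
print(f"training accuracy after 3 iterations: {acc:.3f}")
```

Note that B_bar depends only on X and is computed once before the loop, so each iteration costs no more than a plain first-order step; that iteration-independence is what makes a training run of very few iterations plausible in the homomorphic setting the abstract describes.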
