
FUSE: First-Order and Second-Order Unified SynthEsis in Stochastic Optimization

Abstract

Stochastic optimization methods play a critical role in modern machine learning, underpinning the performance of most training algorithms. Although numerous approaches have been proposed, first-order and second-order methods occupy very different positions: the former dominate deep learning practice yet only guarantee convergence to a stationary point, while the latter remain less popular because of their computational cost in high-dimensional problems. This paper presents a method that unifies first-order and second-order updates in a single algorithmic framework, termed FUSE, from which a practical version (PV) is derived. FUSE-PV is a simple yet efficient optimizer that switches from first-order to second-order updates, and we develop several criteria for deciding when to switch. FUSE-PV is shown to have provably lower computational complexity than SGD and Adam. To validate the proposed scheme, we present an ablation study on several simple test functions and compare against baselines on benchmark datasets.
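As an illustration of the switch-over idea described in the abstract, the following minimal Python sketch moves from plain gradient steps to damped Newton steps on a toy Rosenbrock objective once the gradient norm falls below a threshold. The gradient-norm criterion, step sizes, and damping below are illustrative assumptions, not the switching criteria or the FUSE-PV algorithm developed in the paper.

import numpy as np

def loss(x):
    # Toy 2-D Rosenbrock objective, used only for illustration.
    return (1.0 - x[0]) ** 2 + 100.0 * (x[1] - x[0] ** 2) ** 2

def grad(x):
    return np.array([
        -2.0 * (1.0 - x[0]) - 400.0 * x[0] * (x[1] - x[0] ** 2),
        200.0 * (x[1] - x[0] ** 2),
    ])

def hess(x):
    return np.array([
        [2.0 + 1200.0 * x[0] ** 2 - 400.0 * x[1], -400.0 * x[0]],
        [-400.0 * x[0], 200.0],
    ])

def switch_over_optimize(x0, lr=1e-4, switch_tol=10.0, damping=1e-4, max_iter=10000):
    # First-order phase: plain gradient steps.
    # Second-order phase: damped Newton steps, entered once ||grad|| < switch_tol
    # (a hypothetical switching criterion, not the paper's).
    x = np.asarray(x0, dtype=float)
    switched = False
    for t in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < 1e-8:
            break
        if not switched and np.linalg.norm(g) < switch_tol:
            switched = True  # switch-over point: first order -> second order
        if switched:
            H = hess(x) + damping * np.eye(2)
            candidate = x - np.linalg.solve(H, g)
            # Guard against bad Newton steps by falling back to a gradient step.
            x = candidate if loss(candidate) < loss(x) else x - lr * g
        else:
            x = x - lr * g
    return x, t, switched

x_star, iters, used_second_order = switch_over_optimize([-1.2, 1.0])
print("x* =", x_star, "iterations =", iters, "switched =", used_second_order)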

@article{jiang2025_2503.04204,
  title={FUSE: First-Order and Second-Order Unified SynthEsis in Stochastic Optimization},
  author={Zhanhong Jiang and Md Zahid Hasan and Aditya Balu and Joshua R. Waite and Genyi Huang and Soumik Sarkar},
  journal={arXiv preprint arXiv:2503.04204},
  year={2025}
}