Distributed stochastic gradient tracking algorithm with variance reduction for non-convex optimization. IEEE Transactions on Neural Networks and Learning Systems (TNNLS), 2021.

FastAdaBelief: Improving Convergence Rate for Belief-based Adaptive Optimizers by Exploiting Strong Convexity. IEEE Transactions on Neural Networks and Learning Systems (TNNLS), 2021.

Accelerating Mini-batch SARAH by Step Size Rules. Information Sciences (Inf. Sci.), 2019.

A Survey of Optimization Methods from a Machine Learning Perspective. IEEE Transactions on Cybernetics (IEEE Trans. Cybern.), 2019.