arXiv:1904.10873
S²-LBI: Stochastic Split Linearized Bregman Iterations for Parsimonious Deep Learning

24 April 2019
Yanwei Fu
Donghao Li
Xinwei Sun
Shun Zhang
Yizhou Wang
Y. Yao
Abstract

This paper proposes a novel Stochastic Split Linearized Bregman Iteration (S²-LBI) algorithm to efficiently train deep networks. S²-LBI computes an iterative regularization path with structural sparsity, combining the computational efficiency of LBI with model-selection consistency in learning structural sparsity. The computed solution path intrinsically enables us to enlarge or simplify a network, a property that follows theoretically from the dynamics of the S²-LBI algorithm. Experimental results validate S²-LBI on the MNIST and CIFAR-10 datasets. For example, on MNIST we can either boost a network with only 1.5K parameters (1 convolutional layer of 5 filters and 1 FC layer) to 98.40% recognition accuracy, or prune 82.5% of the parameters of the LeNet-5 network while still achieving 98.47% recognition accuracy. We also have learning results on ImageNet, which will be added in the next version of this report.
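The split LBI dynamics the abstract refers to can be illustrated on a toy problem. The sketch below is a minimal NumPy version of a stochastic split Bregman iteration on sparse linear regression, not the paper's deep-network setup: it keeps a dense working variable `W`, a sparse path variable `Gamma` coupled through an augmented quadratic term, and a Bregman dual variable `z` whose soft-thresholding generates the regularization path. All hyper-parameter names and values (`alpha`, `kappa`, `nu`, the minibatch size) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sparse regression: y = X @ w_true + noise, with w_true mostly zero.
n, d = 200, 20
w_true = np.zeros(d)
w_true[:3] = [3.0, -2.0, 1.5]
X = rng.standard_normal((n, d))
y = X @ w_true + 0.1 * rng.standard_normal(n)

# Illustrative hyper-parameters (not taken from the paper).
alpha, kappa, nu = 0.001, 10.0, 1.0
W = np.zeros(d)      # dense "working" weights, trained by (stochastic) gradient
Gamma = np.zeros(d)  # sparse path variable, follows W through the coupling term
z = np.zeros(d)      # Bregman dual variable driving Gamma

def soft_threshold(v, t):
    """Proximal map of the l1 norm: shrink each entry toward zero by t."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

for k in range(3000):
    idx = rng.choice(n, size=32, replace=False)          # stochastic minibatch
    Xb, yb = X[idx], y[idx]
    # Gradient of the augmented loss L(W) + ||W - Gamma||^2 / (2 nu) w.r.t. W.
    grad_W = Xb.T @ (Xb @ W - yb) / len(idx) + (W - Gamma) / nu
    W = W - kappa * alpha * grad_W
    # Linearized Bregman update on the dual variable of the sparse branch.
    z = z + alpha * (W - Gamma) / nu
    # Gamma stays exactly zero until |z| exceeds the threshold 1,
    # so coordinates enter the model one by one along the path.
    Gamma = kappa * soft_threshold(z, 1.0)

support = np.flatnonzero(np.abs(Gamma) > 1e-6)
```

Reading off `support` at different iterations yields a nested family of sparse models, which is the sense in which a solution path lets one enlarge a small network or simplify a large one.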
