ResearchTrend.AI

arXiv:1806.00468

Implicit Bias of Gradient Descent on Linear Convolutional Networks

1 June 2018
Suriya Gunasekar
Jason D. Lee
Daniel Soudry
Nathan Srebro
Abstract

We show that gradient descent on full-width linear convolutional networks of depth $L$ converges to a linear predictor related to the $\ell_{2/L}$ bridge penalty in the frequency domain. This is in contrast to fully connected linear networks, where gradient descent converges to the hard-margin linear support vector machine solution, regardless of depth.
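The frequency-domain form of the penalty stems from a standard fact: composing full-width circular convolutions multiplies the layers' DFTs coordinate-wise, so each Fourier coefficient of the end-to-end linear predictor is a product of $L$ per-layer coefficients. A minimal numpy sketch of this factorization (filter length and depth are arbitrary illustrative choices, not values from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
n, L = 8, 3  # signal length and network depth (illustrative values)

# One full-width circular filter per layer.
filters = [rng.standard_normal(n) for _ in range(L)]

# Compose the L circular convolutions into one effective linear predictor,
# performing each circular convolution exactly via the FFT.
w = filters[0]
for f in filters[1:]:
    w = np.real(np.fft.ifft(np.fft.fft(w) * np.fft.fft(f)))

# Convolution theorem: the DFT of the end-to-end predictor equals the
# elementwise product of the per-layer DFTs.
prod = np.prod([np.fft.fft(f) for f in filters], axis=0)
assert np.allclose(np.fft.fft(w), prod)
```

Because each Fourier coefficient factors as a depth-$L$ product, penalizing the layer parameters in $\ell_2$ translates into an $\ell_{2/L}$-type penalty on the predictor's Fourier coefficients, which is the regime the abstract describes.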
