Implicit Regularization of Stochastic Gradient Descent in Natural Language Processing: Observations and Implications

Published: 1 November 2018
Authors: Deren Lei, Zichen Sun, Yijun Xiao, William Yang Wang
arXiv: 1811.00659

Papers citing "Implicit Regularization of Stochastic Gradient Descent in Natural Language Processing: Observations and Implications"

4 / 4 papers shown
Title: Why Deep Learning Generalizes
Author: Benjamin L. Badger
Topics: TDI, AI4CE
Metrics: 20 · 3 · 0
Date: 17 Nov 2022

Title: The Discovery of Dynamics via Linear Multistep Methods and Deep Learning: Error Estimation
Authors: Q. Du, Yiqi Gu, Haizhao Yang, Chao Zhou
Metrics: 16 · 20 · 0
Date: 21 Mar 2021

Title: Reproducing Activation Function for Deep Learning
Authors: Senwei Liang, Liyao Lyu, Chunmei Wang, Haizhao Yang
Metrics: 28 · 21 · 0
Date: 13 Jan 2021

Title: Convolutional Neural Networks for Sentence Classification
Author: Yoon Kim
Topics: AILaw, VLM
Metrics: 250 · 13,360 · 0
Date: 25 Aug 2014