
Lazy vs hasty: linearization in deep networks impacts learning schedule based on example difficulty
(arXiv:2209.09658)

19 September 2022
Thomas George
Guillaume Lajoie
A. Baratin
ArXiv · PDF · HTML

Papers citing "Lazy vs hasty: linearization in deep networks impacts learning schedule based on example difficulty"

5 of 5 citing papers shown.

1. Mislabeled examples detection viewed as probing machine learning models: concepts, survey and extensive benchmark
   Thomas George, Pierre Nodet, A. Bondu, Vincent Lemaire (VLM) · 21 Oct 2024

2. Are Sparse Neural Networks Better Hard Sample Learners?
   Q. Xiao, Boqian Wu, Lu Yin, Christopher Neil Gadzinski, Tianjin Huang, Mykola Pechenizkiy, D. Mocanu · 13 Sep 2024

3. How connectivity structure shapes rich and lazy learning in neural circuits
   Yuhan Helena Liu, A. Baratin, Jonathan H. Cornford, Stefan Mihalas, E. Shea-Brown, Guillaume Lajoie · 12 Oct 2023

4. Geometric compression of invariant manifolds in neural nets
   J. Paccolat, Leonardo Petrini, Mario Geiger, Kevin Tyloo, M. Wyart (MLT) · 22 Jul 2020

5. An Investigation of Why Overparameterization Exacerbates Spurious Correlations
   Shiori Sagawa, Aditi Raghunathan, Pang Wei Koh, Percy Liang · 09 May 2020