ResearchTrend.AI

arXiv:2405.19816
Growing Tiny Networks: Spotting Expressivity Bottlenecks and Fixing Them Optimally

30 May 2024
Manon Verbockhaven, Sylvain Chevallier, Guillaume Charpiat

Papers citing "Growing Tiny Networks: Spotting Expressivity Bottlenecks and Fixing Them Optimally"

5 / 5 papers shown

Growth strategies for arbitrary DAG neural architectures
Stella Douka, Manon Verbockhaven, Théo Rudkiewicz, Stéphane Rivaud, François P. Landes, Sylvain Chevallier, Guillaume Charpiat
AI4CE · 17 Feb 2025

ANaGRAM: A Natural Gradient Relative to Adapted Model for efficient PINNs learning
Nilo Schwencke, Cyril Furtlehner
14 Dec 2024

Firefly Neural Architecture Descent: a General Approach for Growing Neural Networks
Lemeng Wu, Bo Liu, Peter Stone, Qiang Liu
17 Feb 2021

NetAdapt: Platform-Aware Neural Network Adaptation for Mobile Applications
Tien-Ju Yang, Andrew G. Howard, Bo Chen, Xiao Zhang, Alec Go, Mark Sandler, Vivienne Sze, Hartwig Adam
09 Apr 2018

Neural Architecture Search with Reinforcement Learning
Barret Zoph, Quoc V. Le
05 Nov 2016