ResearchTrend.AI

Shrink-Perturb Improves Architecture Mixing during Population Based Training for Neural Architecture Search

arXiv: 2307.15621
28 July 2023
A. Chebykin, A. Dushatskiy, T. Alderliesten, Peter A. N. Bosman
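The title refers to the shrink-perturb trick (Ash & Adams, 2020): when re-using weights, scale them toward zero and add a small amount of fresh noise instead of copying them verbatim. A minimal NumPy sketch of the operation (the coefficient values and function name here are illustrative, not taken from the paper):

```python
import numpy as np

def shrink_perturb(weights, shrink=0.4, perturb=0.1, rng=None):
    """Shrink-perturb reinitialization: scale each weight tensor by
    `shrink` and add Gaussian noise scaled by `perturb`."""
    rng = np.random.default_rng() if rng is None else rng
    return [shrink * w + perturb * rng.standard_normal(w.shape)
            for w in weights]

# Example: re-use a parent's weights in a child network.
parent = [np.ones((2, 3)), np.ones(4)]
child = shrink_perturb(parent, shrink=0.4, perturb=0.1)
```

With `shrink=1.0` and `perturb=0.0` this reduces to plain weight copying; smaller `shrink` values discard more of the inherited solution.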

Papers citing "Shrink-Perturb Improves Architecture Mixing during Population Based Training for Neural Architecture Search" (5 of 5 shown):
  • Git Re-Basin: Merging Models modulo Permutation Symmetries
    Samuel K. Ainsworth, J. Hayase, S. Srinivasa
    MoMe · 239 · 313 · 0 · 11 Sep 2022

  • Core-set Sampling for Efficient Neural Architecture Search
    Jaewoong Shim, Kyeongbo Kong, Suk-Ju Kang
    122 · 23 · 0 · 08 Jul 2021

  • D2RL: Deep Dense Architectures in Reinforcement Learning
    Samarth Sinha, Homanga Bharadhwaj, A. Srinivas, Animesh Garg
    OffRL, AI4CE · 43 · 56 · 0 · 19 Oct 2020

  • Efficient Multi-objective Neural Architecture Search via Lamarckian Evolution
    T. Elsken, J. H. Metzen, Frank Hutter
    117 · 498 · 0 · 24 Apr 2018

  • Neural Architecture Search with Reinforcement Learning
    Barret Zoph, Quoc V. Le
    264 · 5,290 · 0 · 05 Nov 2016