Unleashing the Power of Meta-tuning for Few-shot Generalization Through Sparse Interpolated Experts
Shengzhuang Chen, Jihoon Tack, Yunqiao Yang, Yee Whye Teh, Jonathan Richard Schwarz, Ying Wei
arXiv:2403.08477 · MoE · 13 March 2024
Papers citing "Unleashing the Power of Meta-tuning for Few-shot Generalization Through Sparse Interpolated Experts" (8 of 8 papers shown)
From Sparse to Soft Mixtures of Experts
J. Puigcerver, C. Riquelme, Basil Mustafa, N. Houlsby
MoE · 02 Aug 2023
Meta-Learning Sparse Compression Networks
Jonathan Richard Schwarz, Yee Whye Teh
18 May 2022
Mixture-of-Experts with Expert Choice Routing
Yanqi Zhou, Tao Lei, Hanxiao Liu, Nan Du, Yanping Huang, Vincent Zhao, Andrew M. Dai, Zhifeng Chen, Quoc V. Le, James Laudon
MoE · 18 Feb 2022
Meta-learning via Language Model In-context Tuning
Yanda Chen, Ruiqi Zhong, Sheng Zha, George Karypis, He He
15 Oct 2021
Emerging Properties in Self-Supervised Vision Transformers
Mathilde Caron, Hugo Touvron, Ishan Misra, Hervé Jégou, Julien Mairal, Piotr Bojanowski, Armand Joulin
29 Apr 2021
Making Pre-trained Language Models Better Few-shot Learners
Tianyu Gao, Adam Fisch, Danqi Chen
31 Dec 2020
Rapid Learning or Feature Reuse? Towards Understanding the Effectiveness of MAML
Aniruddh Raghu, M. Raghu, Samy Bengio, Oriol Vinyals
19 Sep 2019
Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks
Chelsea Finn, Pieter Abbeel, Sergey Levine
OOD · 09 Mar 2017