When does MAML Work the Best? An Empirical Study on Model-Agnostic Meta-Learning in NLP Applications
Zequn Liu, Ruiyi Zhang, Yiping Song, Wei Ju, Ming Zhang
arXiv:2005.11700 · 24 May 2020

Papers citing "When does MAML Work the Best? An Empirical Study on Model-Agnostic Meta-Learning in NLP Applications" (7 / 7 papers shown)

A Survey of Data-Efficient Graph Learning
Wei Ju, Siyu Yi, Yifan Wang, Qingqing Long, Junyu Luo, Zhiping Xiao, Ming Zhang
01 Feb 2024

An Introduction to Bi-level Optimization: Foundations and Applications in Signal Processing and Machine Learning
Yihua Zhang, Prashant Khanduri, Ioannis C. Tsaknakis, Yuguang Yao, Min-Fong Hong, Sijia Liu
01 Aug 2023

Meta-Learning with a Geometry-Adaptive Preconditioner
Suhyun Kang, Duhun Hwang, Moonjung Eo, Taesup Kim, Wonjong Rhee
04 Apr 2023

Unsupervised Neural Stylistic Text Generation using Transfer learning and Adapters
Vinayshekhar Bannihatti Kumar, Rashmi Gangadharaiah, Dan Roth
07 Oct 2022

Sign-MAML: Efficient Model-Agnostic Meta-Learning by SignSGD
Chen Fan, Parikshit Ram, Sijia Liu
15 Sep 2021

An Overview of Deep Learning Architectures in Few-Shot Learning Domain
Shruti Jadon, Aryan Jadon
12 Aug 2020

Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks
Chelsea Finn, Pieter Abbeel, Sergey Levine
09 Mar 2017