Regularizing Meta-Learning via Gradient Dropout
arXiv 2004.05859 · 13 April 2020
Hung-Yu Tseng, Yi-Wen Chen, Yi-Hsuan Tsai, Sifei Liu, Yen-Yu Lin, Ming-Hsuan Yang
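As the title indicates, the paper regularizes meta-learning by randomly dropping entries of the inner-loop gradients. A minimal illustrative sketch of that idea, assuming a Bernoulli mask with inverted-dropout rescaling (the exact noise scheme in the paper may differ):

```python
import numpy as np

def drop_grad(grad, drop_rate=0.1, rng=None):
    """Gradient dropout sketch: zero out a random subset of gradient
    entries and rescale the survivors so the expected gradient is
    unchanged. `grad` is a NumPy array; `drop_rate` is the fraction
    of entries to drop."""
    rng = rng if rng is not None else np.random.default_rng()
    mask = rng.random(grad.shape) >= drop_rate  # keep with prob 1 - drop_rate
    return grad * mask / (1.0 - drop_rate)

# In a MAML-style inner loop, the masked gradient would replace the
# raw gradient in each task-adaptation step:
#   theta_task = theta - inner_lr * drop_grad(grad_of_task_loss)
```

The names `drop_grad` and the inner-loop usage above are illustrative, not the authors' code.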
Papers citing "Regularizing Meta-Learning via Gradient Dropout" (7 papers shown)
A Channel Coding Benchmark for Meta-Learning
Rui Li, Ondrej Bohdal, Rajesh K. Mishra, Hyeji Kim, Da Li, Nicholas D. Lane, Timothy M. Hospedales
15 Jul 2021
Regularizing Generative Adversarial Networks under Limited Data
Hung-Yu Tseng, Lu Jiang, Ce Liu, Ming-Hsuan Yang, Weilong Yang
Tags: GAN · 07 Apr 2021
Just Pick a Sign: Optimizing Deep Multitask Models with Gradient Sign Dropout
Zhao Chen, Jiquan Ngiam, Yanping Huang, Thang Luong, Henrik Kretzschmar, Yuning Chai, Dragomir Anguelov
14 Oct 2020
Meta-Learning Requires Meta-Augmentation
Janarthanan Rajendran, A. Irpan, Eric Jang
10 Jul 2020
Bayesian Model-Agnostic Meta-Learning
Taesup Kim, Jaesik Yoon, Ousmane Amadou Dia, Sungwoong Kim, Yoshua Bengio, Sungjin Ahn
Tags: UQCV, BDL · 11 Jun 2018
Probabilistic Model-Agnostic Meta-Learning
Chelsea Finn, Kelvin Xu, Sergey Levine
Tags: BDL · 07 Jun 2018
Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks
Chelsea Finn, Pieter Abbeel, Sergey Levine
Tags: OOD · 09 Mar 2017