
PODNet: Pooled Outputs Distillation for Small-Tasks Incremental Learning

28 April 2020
Arthur Douillard
Matthieu Cord
Charles Ollion
Thomas Robert
Eduardo Valle
CLL
arXiv: 2004.13513 (abs / PDF / HTML)

Papers citing "PODNet: Pooled Outputs Distillation for Small-Tasks Incremental Learning"

3 papers
Learning Representations for New Sound Classes With Continual Self-Supervised Learning
IEEE Signal Processing Letters (SPL), 2022
Zhepei Wang
Cem Subakan
Xilin Jiang
Junkai Wu
Efthymios Tzinis
Mirco Ravanelli
Paris Smaragdis
CLL, SSL
15 May 2022
MgSvF: Multi-Grained Slow vs. Fast Framework for Few-Shot Class-Incremental Learning
Hanbin Zhao
Yongjian Fu
Mintong Kang
Qi Tian
Leilei Gan
Xi Li
CLL
28 Jun 2020
Insights from the Future for Continual Learning
Arthur Douillard
Eduardo Valle
Charles Ollion
Thomas Robert
Matthieu Cord
VLM, CLL
24 Jun 2020