Investigating Forgetting in Pre-Trained Representations Through Continual Learning
arXiv:2305.05968
10 May 2023
Yun Luo, Zhen Yang, Xuefeng Bai, Fandong Meng, Jie Zhou, Yue Zhang
Tags: CLL, KELM
Papers citing "Investigating Forgetting in Pre-Trained Representations Through Continual Learning" (4 of 4 papers shown)
DiTASK: Multi-Task Fine-Tuning with Diffeomorphic Transformations
Krishna Sri Ipsit Mantri, Carola-Bibiane Schönlieb, Bruno Ribeiro, Chaim Baskin, Moshe Eliasof
09 Feb 2025
Investigating Continual Pretraining in Large Language Models: Insights and Implications
Çağatay Yıldız, Nishaanth Kanna Ravichandran, Prishruit Punia, Matthias Bethge, B. Ermiş
Tags: CLL, KELM, LRM
27 Feb 2024
Making Pre-trained Language Models Better Few-shot Learners
Tianyu Gao, Adam Fisch, Danqi Chen
31 Dec 2020
GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
Alex Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman
Tags: ELM
20 Apr 2018