Improving Representational Continuity via Continued Pretraining

26 February 2023
Authors: Michael Sun, Ananya Kumar, Divyam Madaan, Percy Liang
ArXiv: 2302.13289
Topics: CLL
Links: ArXiv, PDF, HTML

Papers citing "Improving Representational Continuity via Continued Pretraining" (2 papers)

Adapting BERT for Continual Learning of a Sequence of Aspect Sentiment Classification Tasks
Zixuan Ke, Hu Xu, Bing-Quan Liu
Topics: CLL
06 Dec 2021

Representational Continuity for Unsupervised Continual Learning
Divyam Madaan, Jaehong Yoon, Yuanchun Li, Yunxin Liu, Sung Ju Hwang
Topics: CLL, SSL
13 Oct 2021