Semiparametric Language Models Are Scalable Continual Learners

2 March 2023
Guangyue Peng, Tao Ge, Si-Qing Chen, Furu Wei, Houfeng Wang
KELM

Papers citing "Semiparametric Language Models Are Scalable Continual Learners"

6 papers shown

Learn or Recall? Revisiting Incremental Learning with Pre-trained Language Models
Junhao Zheng, Shengjie Qiu, Qianli Ma
13 · 9 · 0
13 Dec 2023

Continual Training of Language Models for Few-Shot Learning
Zixuan Ke, Haowei Lin, Yijia Shao, Hu Xu, Lei Shu, Bin Liu
KELM, BDL, CLL
83 · 33 · 0
11 Oct 2022

Neuro-Symbolic Language Modeling with Automaton-augmented Retrieval
Uri Alon, Frank F. Xu, Junxian He, Sudipta Sengupta, Dan Roth, Graham Neubig
RALM
67 · 62 · 0
28 Jan 2022

LFPT5: A Unified Framework for Lifelong Few-shot Language Learning Based on Prompt Tuning of T5
Chengwei Qin, Shafiq R. Joty
CLL
150 · 96 · 0
14 Oct 2021

Towards Continual Knowledge Learning of Language Models
Joel Jang, Seonghyeon Ye, Sohee Yang, Joongbo Shin, Janghoon Han, Gyeonghun Kim, Stanley Jungkyu Choi, Minjoon Seo
CLL, KELM
216 · 122 · 0
07 Oct 2021

Efficient Nearest Neighbor Language Models
Junxian He, Graham Neubig, Taylor Berg-Kirkpatrick
RALM
185 · 103 · 0
09 Sep 2021