On the Usage of Continual Learning for Out-of-Distribution Generalization in Pre-trained Language Models of Code
arXiv: 2305.04106, 6 May 2023
Authors: M. Weyssow, Xin Zhou, Kisub Kim, David Lo, H. Sahraoui
Tags: CLL, KELM
Papers citing "On the Usage of Continual Learning for Out-of-Distribution Generalization in Pre-trained Language Models of Code" (6 papers shown)
GitChameleon: Unmasking the Version-Switching Capabilities of Code Generation Models
Nizar Islah, Justine Gehring, Diganta Misra, Eilif B. Muller, Irina Rish, Terry Yue Zhuo, Massimo Caccia
Tags: SyDa. 05 Nov 2024
No More Fine-Tuning? An Experimental Evaluation of Prompt Tuning in Code Intelligence
Chaozheng Wang, Yuanhang Yang, Cuiyun Gao, Yun Peng, Hongyu Zhang, Michael R. Lyu
Tags: AAML. 24 Jul 2022
A Systematic Evaluation of Large Language Models of Code
Frank F. Xu, Uri Alon, Graham Neubig, Vincent J. Hellendoorn
Tags: ELM, ALM. 26 Feb 2022
CodeT5: Identifier-aware Unified Pre-trained Encoder-Decoder Models for Code Understanding and Generation
Yue Wang, Weishi Wang, Shafiq R. Joty, S. Hoi
02 Sep 2021
CodeXGLUE: A Machine Learning Benchmark Dataset for Code Understanding and Generation
Shuai Lu, Daya Guo, Shuo Ren, Junjie Huang, Alexey Svyatkovskiy, ..., Nan Duan, Neel Sundaresan, Shao Kun Deng, Shengyu Fu, Shujie Liu
Tags: ELM. 09 Feb 2021
GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
Alex Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman
Tags: ELM. 20 Apr 2018