ResearchTrend.AI

On the Transferability of Pre-trained Language Models for Low-Resource Programming Languages

arXiv:2204.09653 · 5 April 2022
Fuxiang Chen, F. Fard, David Lo, T. Bryksin

Papers citing "On the Transferability of Pre-trained Language Models for Low-Resource Programming Languages"

7 papers shown
Automatically Generating UI Code from Screenshot: A Divide-and-Conquer-Based Approach
Yuxuan Wan, Chaozheng Wang, Yi Dong, Wenxuan Wang, Shuqing Li, Yintong Huo, M. Lyu
3DV · 69 · 10 · 0 · 24 Jun 2024
Learning Agent-based Modeling with LLM Companions: Experiences of Novices and Experts Using ChatGPT & NetLogo Chat
John Chen, Xi Lu, Michael Rejtig, Yuzhou Du, Ruth Bagley, Mike Horn, Uri Wilensky
18 · 29 · 0 · 30 Jan 2024
Deep Learning for Code Intelligence: Survey, Benchmark and Toolkit
Yao Wan, Yang He, Zhangqian Bi, Jianguo Zhang, Hongyu Zhang, Yulei Sui, Guandong Xu, Hai Jin, Philip S. Yu
27 · 20 · 0 · 30 Dec 2023
A Comprehensive Evaluation of Parameter-Efficient Fine-Tuning on Software Engineering Tasks
Wentao Zou, Qi Li, Jidong Ge, Chuanyi Li, Xiaoyu Shen, LiGuo Huang, Bin Luo
24 · 5 · 0 · 25 Dec 2023
CodeT5: Identifier-aware Unified Pre-trained Encoder-Decoder Models for Code Understanding and Generation
Yue Wang, Weishi Wang, Shafiq R. Joty, S. Hoi
210 · 1,489 · 0 · 02 Sep 2021
Pre-trained Models for Natural Language Processing: A Survey
Xipeng Qiu, Tianxiang Sun, Yige Xu, Yunfan Shao, Ning Dai, Xuanjing Huang
LM&MA · VLM · 243 · 1,450 · 0 · 18 Mar 2020
Text Summarization with Pretrained Encoders
Yang Liu, Mirella Lapata
MILM · 254 · 1,430 · 0 · 22 Aug 2019