TreeBERT: A Tree-Based Pre-Trained Model for Programming Language

26 May 2021
Xue Jiang, Zhuoran Zheng, Chen Lyu, Liang Li, Lei Lyu

Papers citing "TreeBERT: A Tree-Based Pre-Trained Model for Programming Language"

9 papers

Text-to-Code Generation with Modality-relative Pre-training
Fenia Christopoulou, Guchun Zhang, Gerasimos Lampouras
AI4TS
08 Feb 2024

Deep Learning for Code Intelligence: Survey, Benchmark and Toolkit
Yao Wan, Yang He, Zhangqian Bi, Jianguo Zhang, Hongyu Zhang, Yulei Sui, Guandong Xu, Hai Jin, Philip S. Yu
30 Dec 2023

CoLadder: Supporting Programmers with Hierarchical Code Generation in Multi-Level Abstraction
Ryan Yen, Jiawen Zhu, Sangho Suh, Haijun Xia, Jian Zhao
12 Oct 2023

Natural Language Generation and Understanding of Big Code for AI-Assisted Programming: A Review
M. Wong, Shangxin Guo, Ching Nam Hang, Siu-Wai Ho, C. Tan
04 Jul 2023

Neural Machine Translation for Code Generation
K. Dharma, Clayton T. Morrison
22 May 2023

PanGu-Coder: Program Synthesis with Function-Level Language Modeling
Fenia Christopoulou, Gerasimos Lampouras, Milan Gritta, Guchun Zhang, Yinpeng Guo, ..., Guangtai Liang, Jia Wei, Xin Jiang, Qianxiang Wang, Qun Liu
ELM, SyDa, ALM
22 Jul 2022

CODE-MVP: Learning to Represent Source Code from Multiple Views with Contrastive Pre-Training
Xin Wang, Yasheng Wang, Yao Wan, Jiawei Wang, Pingyi Zhou, Li Li, Hao Wu, Jin Liu
04 May 2022

UniXcoder: Unified Cross-Modal Pre-training for Code Representation
Daya Guo, Shuai Lu, Nan Duan, Yanlin Wang, Ming Zhou, Jian Yin
08 Mar 2022

Towards Learning (Dis)-Similarity of Source Code from Program Contrasts
Yangruibo Ding, Luca Buratti, Saurabh Pujar, Alessandro Morari, Baishakhi Ray, Saikat Chakraborty
08 Oct 2021