arXiv: 2301.01701
Extending Source Code Pre-Trained Language Models to Summarise Decompiled Binaries
4 January 2023
Ali Al-Kaswan, Toufique Ahmed, M. Izadi, A. Sawant, Prem Devanbu, A. van Deursen
Papers citing "Extending Source Code Pre-Trained Language Models to Summarise Decompiled Binaries" (5 shown)
FoC: Figure out the Cryptographic Functions in Stripped Binaries with LLMs (27 Mar 2024)
Guoqiang Chen, Xiuwei Shang, Shaoyin Cheng, Yanming Zhang, Weiming Zhang, Neng H. Yu
LLMs in the Heart of Differential Testing: A Case Study on a Medical Rule Engine (16 Feb 2024)
Erblin Isaku, Christoph Laaber, Hassan Sartaj, Shaukat Ali, T. Schwitalla, J. F. Nygård
CodeT5: Identifier-aware Unified Pre-trained Encoder-Decoder Models for Code Understanding and Generation (02 Sep 2021)
Yue Wang, Weishi Wang, Shafiq R. Joty, S. Hoi
On the Evaluation of Neural Code Summarization (15 Jul 2021)
Ensheng Shi, Yanlin Wang, Lun Du, Junjie Chen, Shi Han, Hongyu Zhang, Dongmei Zhang, Hongbin Sun
CodeXGLUE: A Machine Learning Benchmark Dataset for Code Understanding and Generation (09 Feb 2021)
Shuai Lu, Daya Guo, Shuo Ren, Junjie Huang, Alexey Svyatkovskiy, ..., Nan Duan, Neel Sundaresan, Shao Kun Deng, Shengyu Fu, Shujie Liu