ResearchTrend.AI

Extending Source Code Pre-Trained Language Models to Summarise Decompiled Binaries (arXiv:2301.01701)

4 January 2023
Ali Al-Kaswan
Toufique Ahmed
M. Izadi
A. Sawant
Prem Devanbu
A. van Deursen
    SyDa

Papers citing "Extending Source Code Pre-Trained Language Models to Summarise Decompiled Binaries"

5 of 5 citing papers shown.
FoC: Figure out the Cryptographic Functions in Stripped Binaries with LLMs
Guoqiang Chen, Xiuwei Shang, Shaoyin Cheng, Yanming Zhang, Weiming Zhang, Neng H. Yu
27 Mar 2024
LLMs in the Heart of Differential Testing: A Case Study on a Medical Rule Engine
Erblin Isaku, Christoph Laaber, Hassan Sartaj, Shaukat Ali, T. Schwitalla, J. F. Nygård
16 Feb 2024
CodeT5: Identifier-aware Unified Pre-trained Encoder-Decoder Models for Code Understanding and Generation
Yue Wang, Weishi Wang, Shafiq R. Joty, S. Hoi
02 Sep 2021
On the Evaluation of Neural Code Summarization
Ensheng Shi, Yanlin Wang, Lun Du, Junjie Chen, Shi Han, Hongyu Zhang, Dongmei Zhang, Hongbin Sun
ELM
15 Jul 2021
CodeXGLUE: A Machine Learning Benchmark Dataset for Code Understanding and Generation
Shuai Lu, Daya Guo, Shuo Ren, Junjie Huang, Alexey Svyatkovskiy, ..., Nan Duan, Neel Sundaresan, Shao Kun Deng, Shengyu Fu, Shujie Liu
ELM
09 Feb 2021