ResearchTrend.AI
Genetic Auto-prompt Learning for Pre-trained Code Intelligence Language Models (arXiv:2403.13588)

20 March 2024
Chengzhe Feng, Yanan Sun, Ke Li, Pan Zhou, Jiancheng Lv, Aojun Lu

Papers citing "Genetic Auto-prompt Learning for Pre-trained Code Intelligence Language Models"

An investigation on the use of Large Language Models for hyperparameter tuning in Evolutionary Algorithms
Leonardo Lucio Custode, Fabio Caraffini, Anil Yaman, Giovanni Iacca
35 · 2 · 0 · 05 Aug 2024
No More Fine-Tuning? An Experimental Evaluation of Prompt Tuning in Code Intelligence
Chaozheng Wang, Yuanhang Yang, Cuiyun Gao, Yun Peng, Hongyu Zhang, Michael R. Lyu
AAML · 23 · 129 · 0 · 24 Jul 2022
SPoT: Better Frozen Model Adaptation through Soft Prompt Transfer
Tu Vu, Brian Lester, Noah Constant, Rami Al-Rfou, Daniel Matthew Cer
VLM, LRM · 131 · 276 · 0 · 15 Oct 2021
An Empirical Study of GPT-3 for Few-Shot Knowledge-Based VQA
Zhengyuan Yang, Zhe Gan, Jianfeng Wang, Xiaowei Hu, Yumao Lu, Zicheng Liu, Lijuan Wang
161 · 401 · 0 · 10 Sep 2021
A Recipe For Arbitrary Text Style Transfer with Large Language Models
Emily Reif, Daphne Ippolito, Ann Yuan, Andy Coenen, Chris Callison-Burch, Jason W. Wei
202 · 117 · 0 · 08 Sep 2021
CodeT5: Identifier-aware Unified Pre-trained Encoder-Decoder Models for Code Understanding and Generation
Yue Wang, Weishi Wang, Shafiq R. Joty, S. Hoi
196 · 1,451 · 0 · 02 Sep 2021
The Power of Scale for Parameter-Efficient Prompt Tuning
Brian Lester, Rami Al-Rfou, Noah Constant
VPVLM · 275 · 3,784 · 0 · 18 Apr 2021
CodeXGLUE: A Machine Learning Benchmark Dataset for Code Understanding and Generation
Shuai Lu, Daya Guo, Shuo Ren, Junjie Huang, Alexey Svyatkovskiy, ..., Nan Duan, Neel Sundaresan, Shao Kun Deng, Shengyu Fu, Shujie Liu
ELM · 183 · 1,098 · 0 · 09 Feb 2021
WARP: Word-level Adversarial ReProgramming
Karen Hambardzumyan, Hrant Khachatrian, Jonathan May
AAML · 243 · 340 · 0 · 01 Jan 2021
Exploiting Cloze Questions for Few Shot Text Classification and Natural Language Inference
Timo Schick, Hinrich Schütze
248 · 1,382 · 0 · 21 Jan 2020