Cited By

A Bayesian approach for prompt optimization in pre-trained language models
Antonio Sabbatella, Andrea Ponti, Antonio Candelieri, I. Giordani, F. Archetti
1 December 2023 (arXiv:2312.00471)
Papers citing "A Bayesian approach for prompt optimization in pre-trained language models" (8 of 8 papers shown)
Reliable Gradient-free and Likelihood-free Prompt Tuning (30 Apr 2023)
Maohao Shen, S. Ghosh, P. Sattigeri, Subhro Das, Yuheng Bu, G. Wornell [VLM]

Bayesian Optimization over High-Dimensional Combinatorial Spaces via Dictionary-based Embeddings (03 Mar 2023)
Aryan Deshwal, Sebastian Ament, Maximilian Balandat, E. Bakshy, J. Doppa, David Eriksson

BBTv2: Towards a Gradient-Free Future with Large Language Models (23 May 2022)
Tianxiang Sun, Zhengfu He, Hong Qian, Yunhua Zhou, Xuanjing Huang, Xipeng Qiu

Local Latent Space Bayesian Optimization over Structured Inputs (28 Jan 2022)
Natalie Maus, Haydn Thomas Jones, Juston Moore, Matt J. Kusner, John Bradshaw, Jacob R. Gardner [BDL]

P-Tuning v2: Prompt Tuning Can Be Comparable to Fine-tuning Universally Across Scales and Tasks (14 Oct 2021)
Xiao Liu, Kaixuan Ji, Yicheng Fu, Weng Lam Tam, Zhengxiao Du, Zhilin Yang, Jie Tang [VLM]

The Power of Scale for Parameter-Efficient Prompt Tuning (18 Apr 2021)
Brian Lester, Rami Al-Rfou, Noah Constant [VPVLM]

What Makes Good In-Context Examples for GPT-3? (17 Jan 2021)
Jiachang Liu, Dinghan Shen, Yizhe Zhang, Bill Dolan, Lawrence Carin, Weizhu Chen [AAML, RALM]

GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding (20 Apr 2018)
Alex Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman [ELM]