Novel-WD: Exploring acquisition of Novel World Knowledge in LLMs Using Prefix-Tuning
arXiv: 2408.17070 · 30 August 2024
Maxime Méloux, Christophe Cerisara
Tags: KELM, CLL
Papers citing "Novel-WD: Exploring acquisition of Novel World Knowledge in LLMs Using Prefix-Tuning" (5 of 5 shown)

1. Large Language Models Understand and Can be Enhanced by Emotional Stimuli
   Cheng-rong Li, Jindong Wang, Yixuan Zhang, Kaijie Zhu, Wenxin Hou, Jianxun Lian, Fang Luo, Qiang Yang, Xingxu Xie
   Tags: LRM · 67 · 116 · 0 · 14 Jul 2023

2. An Efficient Memory-Augmented Transformer for Knowledge-Intensive NLP Tasks
   Yuxiang Wu, Yu Zhao, Baotian Hu, Pasquale Minervini, Pontus Stenetorp, Sebastian Riedel
   Tags: RALM, KELM · 43 · 42 · 0 · 30 Oct 2022

3. Towards Continual Knowledge Learning of Language Models
   Joel Jang, Seonghyeon Ye, Sohee Yang, Joongbo Shin, Janghoon Han, Gyeonghun Kim, Stanley Jungkyu Choi, Minjoon Seo
   Tags: CLL, KELM · 222 · 122 · 0 · 07 Oct 2021

4. The Power of Scale for Parameter-Efficient Prompt Tuning
   Brian Lester, Rami Al-Rfou, Noah Constant
   Tags: VPVLM · 278 · 3,784 · 0 · 18 Apr 2021

5. Language Models as Knowledge Bases?
   Fabio Petroni, Tim Rocktaschel, Patrick Lewis, A. Bakhtin, Yuxiang Wu, Alexander H. Miller, Sebastian Riedel
   Tags: KELM, AI4MH · 393 · 2,216 · 0 · 03 Sep 2019