Novel-WD: Exploring acquisition of Novel World Knowledge in LLMs Using Prefix-Tuning

arXiv: 2408.17070
30 August 2024
Maxime Méloux, Christophe Cerisara
KELM, CLL

Papers citing "Novel-WD: Exploring acquisition of Novel World Knowledge in LLMs Using Prefix-Tuning" (5 papers)
Large Language Models Understand and Can be Enhanced by Emotional Stimuli
Cheng Li, Jindong Wang, Yixuan Zhang, Kaijie Zhu, Wenxin Hou, Jianxun Lian, Fang Luo, Qiang Yang, Xing Xie
LRM
14 Jul 2023
An Efficient Memory-Augmented Transformer for Knowledge-Intensive NLP Tasks
Yuxiang Wu, Yu Zhao, Baotian Hu, Pasquale Minervini, Pontus Stenetorp, Sebastian Riedel
RALM, KELM
30 Oct 2022
Towards Continual Knowledge Learning of Language Models
Joel Jang, Seonghyeon Ye, Sohee Yang, Joongbo Shin, Janghoon Han, Gyeonghun Kim, Stanley Jungkyu Choi, Minjoon Seo
CLL, KELM
07 Oct 2021
The Power of Scale for Parameter-Efficient Prompt Tuning
Brian Lester, Rami Al-Rfou, Noah Constant
VPVLM
18 Apr 2021
Language Models as Knowledge Bases?
Fabio Petroni, Tim Rocktäschel, Patrick Lewis, A. Bakhtin, Yuxiang Wu, Alexander H. Miller, Sebastian Riedel
KELM, AI4MH
03 Sep 2019