arXiv:2410.01380
Knowledge Entropy Decay during Language Model Pretraining Hinders New Knowledge Acquisition
2 October 2024
Authors: Jiyeon Kim, Hyunji Lee, Hyowon Cho, Joel Jang, Hyeonbin Hwang, Seungpil Won, Youbin Ahn, Dohaeng Lee, Minjoon Seo
Topics: KELM
Papers citing "Knowledge Entropy Decay during Language Model Pretraining Hinders New Knowledge Acquisition" (1 of 1 papers shown)
How Do LLMs Acquire New Knowledge? A Knowledge Circuits Perspective on Continual Pre-Training
Authors: Yixin Ou, Yunzhi Yao, N. Zhang, Hui Jin, Jiacheng Sun, Shumin Deng, Z. Li, H. Chen
Topics: KELM, CLL
16 Feb 2025