Knowledge Entropy Decay during Language Model Pretraining Hinders New Knowledge Acquisition

2 October 2024
Jiyeon Kim
Hyunji Lee
Hyowon Cho
Joel Jang
Hyeonbin Hwang
Seungpil Won
Youbin Ahn
Dohaeng Lee
Minjoon Seo
    KELM
arXiv:2410.01380

Papers citing "Knowledge Entropy Decay during Language Model Pretraining Hinders New Knowledge Acquisition"

1 / 1 papers shown
How Do LLMs Acquire New Knowledge? A Knowledge Circuits Perspective on Continual Pre-Training
Yixin Ou
Yunzhi Yao
N. Zhang
Hui Jin
Jiacheng Sun
Shumin Deng
Z. Li
H. Chen
KELM
CLL
16 Feb 2025