How Much Knowledge Can You Pack Into the Parameters of a Language Model?


Conference on Empirical Methods in Natural Language Processing (EMNLP), 2020

Papers citing "How Much Knowledge Can You Pack Into the Parameters of a Language Model?"

50 / 627 papers shown
Towards Few-Shot Fact-Checking via Perplexity
North American Chapter of the Association for Computational Linguistics (NAACL), 2021
Nayeon Lee, Yejin Bang, Andrea Madotto, Madian Khabsa, Pascale Fung
17 Mar 2021
Get Your Vitamin C! Robust Fact Verification with Contrastive Evidence
North American Chapter of the Association for Computational Linguistics (NAACL), 2021
15 Mar 2021
Learning Dense Representations of Phrases at Scale
Annual Meeting of the Association for Computational Linguistics (ACL), 2021
23 Dec 2020
Entity Linking in 100 Languages
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2020
05 Nov 2020
Differentiable Open-Ended Commonsense Reasoning
North American Chapter of the Association for Computational Linguistics (NAACL), 2021
24 Oct 2020
Answering Complex Open-Domain Questions with Multi-Hop Dense Retrieval
International Conference on Learning Representations (ICLR), 2021
Wenhan Xiong, Xiang Lorraine Li, Srini Iyer, Jingfei Du, Patrick Lewis, ..., Yashar Mehdad, Anuj Kumar, Sebastian Riedel, Douwe Kiela, Barlas Oğuz
27 Sep 2020