Can Generative Pre-trained Language Models Serve as Knowledge Bases for Closed-book QA?
Cunxiang Wang, Pai Liu, Yue Zhang
3 June 2021 · arXiv:2106.01561 · RALM

Papers citing "Can Generative Pre-trained Language Models Serve as Knowledge Bases for Closed-book QA?" (6 of 56 papers shown)

Recent Advances in Natural Language Processing via Large Pre-Trained Language Models: A Survey
Bonan Min, Hayley L. Ross, Elior Sulem, Amir Pouran Ben Veyseh, Thien Huu Nguyen, Oscar Sainz, Eneko Agirre, Ilana Heintz, Dan Roth
LM&MA, VLM, AI4CE · 01 Nov 2021

Towards Continual Knowledge Learning of Language Models
Joel Jang, Seonghyeon Ye, Sohee Yang, Joongbo Shin, Janghoon Han, Gyeonghun Kim, Stanley Jungkyu Choi, Minjoon Seo
CLL, KELM · 07 Oct 2021

Exploring Generalization Ability of Pretrained Language Models on Arithmetic and Logical Reasoning
Cunxiang Wang, Boyuan Zheng, Y. Niu, Yue Zhang
LRM · 15 Aug 2021

Retrieval-Free Knowledge-Grounded Dialogue Response Generation with Adapters
Yan Xu, Etsuko Ishii, Samuel Cahyawijaya, Zihan Liu, Genta Indra Winata, Andrea Madotto, Dan Su, Pascale Fung
RALM · 13 May 2021

Relational World Knowledge Representation in Contextual Language Models: A Review
Tara Safavi, Danai Koutra
KELM · 12 Apr 2021

Language Models as Knowledge Bases?
Fabio Petroni, Tim Rocktäschel, Patrick Lewis, A. Bakhtin, Yuxiang Wu, Alexander H. Miller, Sebastian Riedel
KELM, AI4MH · 03 Sep 2019