Pre-training Is (Almost) All You Need: An Application to Commonsense Reasoning
Alexandre Tamborrino, Nicola Pellicanò, B. Pannier, Pascal Voitot, Louise Naudin
arXiv:2004.14074 · 29 April 2020 · [LRM]
Papers citing "Pre-training Is (Almost) All You Need: An Application to Commonsense Reasoning" (9 papers shown)
VLIS: Unimodal Language Models Guide Multimodal Language Generation
Jiwan Chung, Youngjae Yu (15 Oct 2023) [VLM]
Multi-hop Commonsense Knowledge Injection Framework for Zero-Shot Commonsense Question Answering
Xin Guan, Biwei Cao, Qingqing Gao, Zheng Yin, Bo Liu, Jiuxin Cao (10 May 2023)
Natural Language Reasoning, A Survey
Fei Yu, Hongbo Zhang, Prayag Tiwari, Benyou Wang (26 Mar 2023) [ReLM, LRM]
Evaluate Confidence Instead of Perplexity for Zero-shot Commonsense Reasoning
Letian Peng, Z. Li, Hai Zhao (23 Aug 2022) [ReLM, LRM]
LogiGAN: Learning Logical Reasoning via Adversarial Pre-training
Xinyu Pi, Wanjun Zhong, Yan Gao, Nan Duan, Jian-Guang Lou (18 May 2022) [NAI, GAN, LRM, AI4CE]
Rethinking Why Intermediate-Task Fine-Tuning Works
Ting-Yun Chang, Chi-Jen Lu (26 Aug 2021) [LRM]
REPT: Bridging Language Models and Machine Reading Comprehension via Retrieval-Based Pre-training
Fangkai Jiao, Yangyang Guo, Yilin Niu, Feng Ji, Feng-Lin Li, Liqiang Nie (10 May 2021) [LRM]
Is Incoherence Surprising? Targeted Evaluation of Coherence Prediction from Language Models
Anne Beyer, Sharid Loáiciga, David Schlangen (07 May 2021)
Relational World Knowledge Representation in Contextual Language Models: A Review
Tara Safavi, Danai Koutra (12 Apr 2021) [KELM]