Unsupervised Pretraining for Fact Verification by Language Model Distillation
28 September 2023 · arXiv: 2309.16540
Authors: A. Bazaga, Pietro Lió, Bo Dai
Tags: HILM
Papers citing "Unsupervised Pretraining for Fact Verification by Language Model Distillation" (6 of 6 papers shown)
1. ReAct: Synergizing Reasoning and Acting in Language Models (06 Oct 2022)
   Authors: Shunyu Yao, Jeffrey Zhao, Dian Yu, Nan Du, Izhak Shafran, Karthik Narasimhan, Yuan Cao
   Tags: LLMAG, ReLM, LRM · Metrics: 223 · 2,413 · 0

2. Generate rather than Retrieve: Large Language Models are Strong Context Generators (21 Sep 2022)
   Authors: W. Yu, Dan Iter, Shuohang Wang, Yichong Xu, Mingxuan Ju, Soumya Sanyal, Chenguang Zhu, Michael Zeng, Meng Jiang
   Tags: RALM, AIMat · Metrics: 215 · 318 · 0

3. Unsupervised Semantic Segmentation by Contrasting Object Mask Proposals (11 Feb 2021)
   Authors: Wouter Van Gansbeke, Simon Vandenhende, Stamatios Georgoulis, Luc Van Gool
   Tags: SSL · Metrics: 185 · 247 · 0

4. SEED: Self-supervised Distillation For Visual Representation (12 Jan 2021)
   Authors: Zhiyuan Fang, Jianfeng Wang, Lijuan Wang, Lei Zhang, Yezhou Yang, Zicheng Liu
   Tags: SSL · Metrics: 231 · 186 · 0

5. Improved Baselines with Momentum Contrastive Learning (09 Mar 2020)
   Authors: Xinlei Chen, Haoqi Fan, Ross B. Girshick, Kaiming He
   Tags: SSL · Metrics: 235 · 3,029 · 0

6. Reasoning Over Semantic-Level Graph for Fact Checking (09 Sep 2019)
   Authors: Wanjun Zhong, Jingjing Xu, Duyu Tang, Zenan Xu, Nan Duan, M. Zhou, Jiahai Wang, Jian Yin
   Tags: HILM, GNN · Metrics: 175 · 163 · 0