arXiv: 2311.05472
Text Representation Distillation via Information Bottleneck Principle
9 November 2023
Authors: Yanzhao Zhang, Dingkun Long, Zehan Li, Pengjun Xie
Papers citing "Text Representation Distillation via Information Bottleneck Principle" (2 of 2 papers shown)
RetroMAE: Pre-Training Retrieval-oriented Language Models Via Masked Auto-Encoder
Shitao Xiao, Zheng Liu, Yingxia Shao, Zhao Cao
RALM, 24 May 2022
Unsupervised Corpus Aware Language Model Pre-training for Dense Passage Retrieval
Luyu Gao, Jamie Callan
RALM, 12 Aug 2021