Boot and Switch: Alternating Distillation for Zero-Shot Dense Retrieval
27 November 2023
Fan Jiang, Qiongkai Xu, Tom Drummond, Trevor Cohn
arXiv: 2311.15564
Papers citing "Boot and Switch: Alternating Distillation for Zero-Shot Dense Retrieval" (10 papers)
ConceptCarve: Dynamic Realization of Evidence. Eylon Caplan, Dan Goldwasser. 09 Apr 2025.
PairDistill: Pairwise Relevance Distillation for Dense Retrieval. Chao-Wei Huang, Yun-Nung Chen. 02 Oct 2024.
RetroMAE: Pre-Training Retrieval-oriented Language Models Via Masked Auto-Encoder. Shitao Xiao, Zheng Liu, Yingxia Shao, Zhao Cao. 24 May 2022.
EncT5: A Framework for Fine-tuning T5 as Non-autoregressive Models. Frederick Liu, T. Huang, Shihang Lyu, Siamak Shakeri, Hongkun Yu, Jing Li. 16 Oct 2021.
Salient Phrase Aware Dense Retrieval: Can a Dense Retriever Imitate a Sparse One? Xilun Chen, Kushal Lakhotia, Barlas Oğuz, Anchit Gupta, Patrick Lewis, Stanislav Peshterliev, Yashar Mehdad, Sonal Gupta, Wen-tau Yih. 13 Oct 2021.
SPLADE v2: Sparse Lexical and Expansion Model for Information Retrieval. Thibault Formal, Carlos Lassance, Benjamin Piwowarski, S. Clinchant. 21 Sep 2021.
Unsupervised Corpus Aware Language Model Pre-training for Dense Passage Retrieval. Luyu Gao, Jamie Callan. 12 Aug 2021.
BEIR: A Heterogenous Benchmark for Zero-shot Evaluation of Information Retrieval Models. Nandan Thakur, Nils Reimers, Andreas Rücklé, Abhishek Srivastava, Iryna Gurevych. 17 Apr 2021.
RocketQA: An Optimized Training Approach to Dense Passage Retrieval for Open-Domain Question Answering. Yingqi Qu, Yuchen Ding, Jing Liu, Kai Liu, Ruiyang Ren, Xin Zhao, Daxiang Dong, Hua-Hong Wu, Haifeng Wang. 16 Oct 2020.
Revisiting Self-Training for Neural Sequence Generation. Junxian He, Jiatao Gu, Jiajun Shen, Marc'Aurelio Ranzato. 30 Sep 2019.