Online Distillation for Pseudo-Relevance Feedback
Sean MacAvaney, Xi Wang
16 June 2023 · arXiv:2306.09657

Papers citing "Online Distillation for Pseudo-Relevance Feedback" (6 papers shown):

  1. On Precomputation and Caching in Information Retrieval Experiments with Pipeline Architectures. Sean MacAvaney, Craig Macdonald. 14 Apr 2025.
  2. Breaking the Lens of the Telescope: Online Relevance Estimation over Large Retrieval Sets. Mandeep Rathee, Venktesh V, Sean MacAvaney, Avishek Anand. 12 Apr 2025.
  3. SPLADE v2: Sparse Lexical and Expansion Model for Information Retrieval. Thibault Formal, Carlos Lassance, Benjamin Piwowarski, S. Clinchant. 21 Sep 2021.
  4. Overview of the TREC 2020 deep learning track. Nick Craswell, Bhaskar Mitra, Emine Yilmaz, Daniel Fernando Campos. 15 Feb 2021.
  5. LRC-BERT: Latent-representation Contrastive Knowledge Distillation for Natural Language Understanding. Hao Fu, Shaojun Zhou, Qihong Yang, Junjie Tang, Guiquan Liu, Kaikui Liu, Xiaolong Li. 14 Dec 2020.
  6. Overview of the TREC 2019 deep learning track. Nick Craswell, Bhaskar Mitra, Emine Yilmaz, Daniel Fernando Campos, E. Voorhees. 17 Mar 2020.