RetroMAE: Pre-Training Retrieval-oriented Language Models Via Masked Auto-Encoder
24 May 2022 (arXiv:2205.12035)
Shitao Xiao, Zheng Liu, Yingxia Shao, Zhao Cao
Tags: RALM

Papers citing "RetroMAE: Pre-Training Retrieval-oriented Language Models Via Masked Auto-Encoder"

13 papers shown.
  1. Unsupervised Corpus Poisoning Attacks in Continuous Space for Dense Retrieval (24 Apr 2025)
     Yongkang Li, Panagiotis Eustratiadis, Simon Lupart, Evangelos Kanoulas
     Tags: AAML
  2. Unleashing the Power of LLMs in Dense Retrieval with Query Likelihood Modeling (07 Apr 2025)
     Hengran Zhang, Keping Bi, J. Guo, Xiaojie Sun, Shihao Liu, Daiting Shi, Dawei Yin, Xueqi Cheng
     Tags: RALM
  3. Universal Item Tokenization for Transferable Generative Recommendation (06 Apr 2025)
     Bowen Zheng, Hongyu Lu, Yu Chen, Wayne Xin Zhao, Ji-Rong Wen
  4. Hypencoder: Hypernetworks for Information Retrieval (07 Feb 2025)
     Julian Killingback, Hansi Zeng, Hamed Zamani
  5. Does Self-Attention Need Separate Weights in Transformers? (30 Nov 2024)
     Md. Kowsher, Nusrat Jahan Prottasha, Chun-Nam Yu, O. Garibay, Niloofar Yousefi
  6. CoIR: A Comprehensive Benchmark for Code Information Retrieval Models (03 Jul 2024)
     Xiangyang Li, Kuicai Dong, Yi Quan Lee, Wei Xia, Yichun Yin, Xinyi Dai, Yasheng Wang, Ruiming Tang
  7. ERNIE-Search: Bridging Cross-Encoder with Dual-Encoder via Self On-the-fly Distillation for Dense Passage Retrieval (18 May 2022)
     Yuxiang Lu, Yiding Liu, Jiaxiang Liu, Yunsheng Shi, Zhengjie Huang, ..., Hao Tian, Hua-Hong Wu, Shuaiqiang Wang, Dawei Yin, Haifeng Wang
  8. RocketQAv2: A Joint Training Method for Dense Passage Retrieval and Passage Re-ranking (14 Oct 2021)
     Ruiyang Ren, Yingqi Qu, Jing Liu, Wayne Xin Zhao, Qiaoqiao She, Hua-Hong Wu, Haifeng Wang, Ji-Rong Wen
  9. Unsupervised Corpus Aware Language Model Pre-training for Dense Passage Retrieval (12 Aug 2021)
     Luyu Gao, Jamie Callan
     Tags: RALM
  10. BEIR: A Heterogenous Benchmark for Zero-shot Evaluation of Information Retrieval Models (17 Apr 2021)
      Nandan Thakur, Nils Reimers, Andreas Rucklé, Abhishek Srivastava, Iryna Gurevych
      Tags: VLM
  11. Training Large-Scale News Recommenders with Pretrained Language Models in the Loop (18 Feb 2021)
      Shitao Xiao, Zheng Liu, Yingxia Shao, Tao Di, Xing Xie
      Tags: VLM, AIFin
  12. RocketQA: An Optimized Training Approach to Dense Passage Retrieval for Open-Domain Question Answering (16 Oct 2020)
      Yingqi Qu, Yuchen Ding, Jing Liu, Kai Liu, Ruiyang Ren, Xin Zhao, Daxiang Dong, Hua-Hong Wu, Haifeng Wang
      Tags: RALM, OffRL
  13. Pretrained Transformers for Text Ranking: BERT and Beyond (13 Oct 2020)
      Jimmy J. Lin, Rodrigo Nogueira, Andrew Yates
      Tags: VLM