ResearchTrend.AI

URL: Universal Referential Knowledge Linking via Task-instructed Representation Compression

24 April 2024
Zhuoqun Li, Hongyu Lin, Tianshu Wang, Boxi Cao, Yaojie Lu, Weixiang Zhou, Hao Wang, Zhenyu Zeng, Le Sun, Xianpei Han

Papers citing "URL: Universal Referential Knowledge Linking via Task-instructed Representation Compression"

4 / 4 papers shown

Making Large Language Models A Better Foundation For Dense Retrieval
Chaofan Li, Zheng Liu, Shitao Xiao, Yingxia Shao
RALM · 30 · 34 · 0 · 24 Dec 2023

Language Model Alignment with Elastic Reset
Michael Noukhovitch, Samuel Lavoie, Florian Strub, Aaron Courville
KELM · 87 · 25 · 0 · 06 Dec 2023

RetroMAE: Pre-Training Retrieval-oriented Language Models Via Masked Auto-Encoder
Shitao Xiao, Zheng Liu, Yingxia Shao, Zhao Cao
RALM · 115 · 109 · 0 · 24 May 2022

BEIR: A Heterogenous Benchmark for Zero-shot Evaluation of Information Retrieval Models
Nandan Thakur, Nils Reimers, Andreas Rucklé, Abhishek Srivastava, Iryna Gurevych
VLM · 229 · 961 · 0 · 17 Apr 2021