ResearchTrend.AI

Cited By (arXiv 2210.15859)
You can't pick your neighbors, or can you? When and how to rely on retrieval in the $k$NN-LM

28 October 2022
Andrew Drozdov
Shufan Wang
Razieh Rahimi
Andrew McCallum
Hamed Zamani
Mohit Iyyer
    RALM
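For context (a sketch, not taken from the paper itself): the $k$NN-LM that this paper and most of the citing works build on interpolates the base language model's next-token distribution with a distribution formed from retrieved nearest neighbors, $p(w \mid x) = \lambda\, p_{kNN}(w \mid x) + (1 - \lambda)\, p_{LM}(w \mid x)$. A minimal pure-Python sketch, where the function name, the softmax-over-negative-distances weighting, and the default $\lambda$ are illustrative assumptions:

```python
import math
from collections import defaultdict

def knn_lm_next_token(p_lm, neighbor_tokens, neighbor_dists, lam=0.25, temp=1.0):
    """Interpolate a base LM's next-token distribution with a kNN distribution.

    p_lm            -- base LM probabilities, indexed by token id
    neighbor_tokens -- token id each retrieved neighbor continues with
    neighbor_dists  -- distance of each neighbor to the query context
    """
    # One weight per neighbor: softmax over negative distances (closer => heavier)
    ws = [math.exp(-d / temp) for d in neighbor_dists]
    total = sum(ws)
    # Aggregate neighbor weights by the token each neighbor predicts
    p_knn = defaultdict(float)
    for tok, w in zip(neighbor_tokens, ws):
        p_knn[tok] += w / total
    # Interpolate: p(w|x) = lam * p_kNN(w|x) + (1 - lam) * p_LM(w|x)
    return [lam * p_knn.get(i, 0.0) + (1 - lam) * p for i, p in enumerate(p_lm)]
```

The question studied in the paper above is, roughly, when this fixed-$\lambda$ mixture helps and when the retrieved neighbors should be trusted less.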

Papers citing "You can't pick your neighbors, or can you? When and how to rely on retrieval in the $k$NN-LM"

7 / 7 papers shown
Nearest Neighbor Speculative Decoding for LLM Generation and Attribution
Minghan Li, Xilun Chen, Ari Holtzman, Beidi Chen, Jimmy Lin, Wen-tau Yih, Xi Victoria Lin
RALM, BDL · 105 · 10 · 0 · 29 May 2024

Training Language Models with Memory Augmentation
Zexuan Zhong, Tao Lei, Danqi Chen
RALM · 221 · 126 · 0 · 25 May 2022

Chunk-based Nearest Neighbor Machine Translation
Pedro Henrique Martins, Zita Marinho, André F.T. Martins
RALM · 65 · 25 · 0 · 24 May 2022

Neuro-Symbolic Language Modeling with Automaton-augmented Retrieval
Uri Alon, Frank F. Xu, Junxian He, Sudipta Sengupta, Dan Roth, Graham Neubig
RALM · 67 · 62 · 0 · 28 Jan 2022

Efficient Nearest Neighbor Language Models
Junxian He, Graham Neubig, Taylor Berg-Kirkpatrick
RALM · 182 · 103 · 0 · 09 Sep 2021

Deduplicating Training Data Makes Language Models Better
Katherine Lee, Daphne Ippolito, A. Nystrom, Chiyuan Zhang, Douglas Eck, Chris Callison-Burch, Nicholas Carlini
SyDa · 234 · 447 · 0 · 14 Jul 2021

Megatron-LM: Training Multi-Billion Parameter Language Models Using Model Parallelism
M. Shoeybi, M. Patwary, Raul Puri, P. LeGresley, Jared Casper, Bryan Catanzaro
MoE · 243 · 1,791 · 0 · 17 Sep 2019