Recurrent Context Compression: Efficiently Expanding the Context Window of LLM
arXiv:2406.06110, 10 June 2024
Chensen Huang, Guibo Zhu, Xuepeng Wang, Yifei Luo, Guojing Ge, Haoran Chen, Dong Yi, Jinqiao Wang
Papers citing "Recurrent Context Compression: Efficiently Expanding the Context Window of LLM" (4 of 4 papers shown)
Bridging Context Gaps: Leveraging Coreference Resolution for Long Contextual Understanding
Yanming Liu, Xinyue Peng, Jiannan Cao, Shi Bo, Yanxin Shen, Tianyu Du, Sheng Cheng, Xun Wang, Jianwei Yin, Xuhong Zhang
02 Oct 2024
Leave No Context Behind: Efficient Infinite Context Transformers with Infini-attention
Tsendsuren Munkhdalai, Manaal Faruqui, Siddharth Gopal
LRM, LLMAG, CLL
10 Apr 2024
LongLLMLingua: Accelerating and Enhancing LLMs in Long Context Scenarios via Prompt Compression
Huiqiang Jiang, Qianhui Wu, Xufang Luo, Dongsheng Li, Chin-Yew Lin, Yuqing Yang, Lili Qiu
RALM
10 Oct 2023
The Pile: An 800GB Dataset of Diverse Text for Language Modeling
Leo Gao, Stella Biderman, Sid Black, Laurence Golding, Travis Hoppe, ..., Horace He, Anish Thite, Noa Nabeshima, Shawn Presser, Connor Leahy
AIMat
31 Dec 2020