arXiv: 2410.23771
What is Wrong with Perplexity for Long-context Language Modeling?
31 October 2024
Lizhe Fang, Yifei Wang, Zhaoyang Liu, Chenheng Zhang, Stefanie Jegelka, Jinyang Gao, Bolin Ding, Yisen Wang
Papers citing "What is Wrong with Perplexity for Long-context Language Modeling?" (3 papers)
RWKV-X: A Linear Complexity Hybrid Language Model
Haowen Hou, Zhiyi Huang, Kaifeng Tan, Rongchang Lu, Fei Richard Yu
30 Apr 2025
ConSens: Assessing Context Grounding in Open-book Question Answering
Ivan Vankov, Matyo Ivanov, Adriana Correia, Victor Botev
30 Apr 2025
When Precision Meets Position: BFloat16 Breaks Down RoPE in Long-Context Training
Haonan Wang, Qian Liu, Chao Du, Tongyao Zhu, Cunxiao Du, Kenji Kawaguchi, Tianyu Pang
20 Nov 2024