Understanding the RoPE Extensions of Long-Context LLMs: An Attention Perspective

19 June 2024
M. Zhong, Chen Zhang, Yikun Lei, Xikai Liu, Yan Gao, Yao Hu, Kehai Chen, Min Zhang

Papers citing "Understanding the RoPE Extensions of Long-Context LLMs: An Attention Perspective" (3 papers shown)
The Power of Personality: A Human Simulation Perspective to Investigate Large Language Model Agents
Yifan Duan, Yihong Tang, Xuefeng Bai, Kehai Chen, J. Li, Min Zhang
28 Feb 2025
Large Language Models: A Survey
Shervin Minaee, Tomáš Mikolov, Narjes Nikzad, M. Asgari-Chenaghlu, R. Socher, Xavier Amatriain, Jianfeng Gao
09 Feb 2024
DeepSeek LLM: Scaling Open-Source Language Models with Longtermism
DeepSeek-AI: Xiao Bi, Deli Chen, Guanting Chen, ..., Yao Zhao, Shangyan Zhou, Shunfeng Zhou, Qihao Zhu, Yuheng Zou
05 Jan 2024