DySK-Attn: A Framework for Efficient, Real-Time Knowledge Updating in Large Language Models via Dynamic Sparse Knowledge Attention
10 August 2025
Kabir Khan
Priya Sharma
Arjun Mehta
Neha Gupta
Ravi Narayanan
KELM
arXiv:2508.07185 (abs · PDF · HTML)

Papers citing "DySK-Attn: A Framework for Efficient, Real-Time Knowledge Updating in Large Language Models via Dynamic Sparse Knowledge Attention"

1 / 1 papers shown
KnowledgeSmith: Uncovering Knowledge Updating in LLMs with Model Editing and Unlearning
Yinyi Luo
Z. Zhou
Hao Chen
Kai Qiu
Marios Savvides
Shouqing Yang
James Evans
KELM, MU
01 Oct 2025