ResearchTrend.AI
Spark Transformer: Reactivating Sparsity in FFN and Attention
v2 (latest)


7 June 2025
Chong You
Kan Wu
Zhipeng Jia
Lin Chen
Srinadh Bhojanapalli
Jiaxian Guo
Utku Evci
Jan Wassenberg
Praneeth Netrapalli
Jeremiah J. Willcock
Suvinay Subramanian
Felix Chern
Alek Andreev
Shreya Pathak
Felix X. Yu
Prateek Jain
David Culler
Henry M. Levy
Sanjiv Kumar
arXiv: 2506.06644

Papers citing "Spark Transformer: Reactivating Sparsity in FFN and Attention"

Showing 2 of 2 citing papers.
Universal Properties of Activation Sparsity in Modern Large Language Models
Filip Szatkowski
Patryk Bedkowski
Alessio Devoto
Jan Dubiñski
Pasquale Minervini
Mikołaj Piórczyński
Simone Scardapane
Bartosz Wójcik
30 Aug 2025
GLASS: Test-Time Acceleration for LLMs via Global-Local Neural Importance Aggregation
Amirmohsen Sattarifard
Sepehr Lavasani
Ehsan Imani
Kunlin Zhang
Hanlin Xu
Fengyu Sun
Negar Hassanpour
Chao Gao
19 Aug 2025