Chipmunk: Training-Free Acceleration of Diffusion Transformers with Dynamic Column-Sparse Deltas

3 June 2025
Austin Silveria, Soham V. Govande, Daniel Y. Fu
arXiv (abs) · PDF · HTML

Papers citing "Chipmunk: Training-Free Acceleration of Diffusion Transformers with Dynamic Column-Sparse Deltas"

2 papers shown

A Survey on Cache Methods in Diffusion Models: Toward Efficient Multi-Modal Generation
Jiacheng Liu, Xinyu Wang, Yuqi Lin, Zhikai Wang, P. Wang, ..., Zexuan Yan, Zhengyi Shi, Chang Zou, Yue Ma, Linfeng Zhang
22 Oct 2025

Universal Properties of Activation Sparsity in Modern Large Language Models
Filip Szatkowski, Patryk Bedkowski, Alessio Devoto, Jan Dubiński, Pasquale Minervini, Mikołaj Piórczyński, Simone Scardapane, Bartosz Wójcik
30 Aug 2025