SQUAT: Stateful Quantization-Aware Training in Recurrent Spiking Neural Networks

15 April 2024
Sreyes P. Venkatesh, Razvan Marinescu, Nhan Duy Truong
arXiv: 2404.19668 (abs · PDF · HTML) · GitHub (1,599★)

Papers citing "SQUAT: Stateful Quantization-Aware Training in Recurrent Spiking Neural Networks"

3 papers
Reducing Data Bottlenecks in Distributed, Heterogeneous Neural Networks
International Symposium on Embedded Multicore/Many-core Systems-on-Chip (MCSoC), 2024
Ruhai Lin, Rui-Jie Zhu, Nhan Duy Truong
12 Oct 2024

Scalable MatMul-free Language Modeling
Rui-Jie Zhu, Yu Zhang, Ethan Sifferman, Tyler Sheaves, Yiqiao Wang, Dustin Richmond, P. Zhou, Nhan Duy Truong
04 Jun 2024

SpikeGPT: Generative Pre-trained Language Model with Spiking Neural Networks
Rui-Jie Zhu, Qihang Zhao, Guoqi Li, Nhan Duy Truong
27 Feb 2023