Quantization-Guided Training for Compact TinyML Models

10 March 2021 · arXiv:2103.06231
Sedigh Ghamari, Koray Ozcan, Thu Dinh, A. Melnikov, Juan Carvajal, Jan Ernst, S. Chai

Papers citing "Quantization-Guided Training for Compact TinyML Models"

6 papers shown
Quantizing Small-Scale State-Space Models for Edge AI
Leo Zhao, Tristan Torchet, Melika Payvand, Laura Kriener, Filippo Moro
14 Jun 2025
Deep learning model compression using network sensitivity and gradients
M. Sakthi, N. Yadla, Raj Pawate
11 Oct 2022
Overcoming Oscillations in Quantization-Aware Training
Markus Nagel, Marios Fournarakis, Yelysei Bondarenko, Tijmen Blankevoort
21 Mar 2022
An Empirical Study of Low Precision Quantization for TinyML
Shaojie Zhuo, Hongyu Chen, R. Ramakrishnan, Tommy Chen, Chen Feng, Yi-Rung Lin, Parker Zhang, Liang Shen
10 Mar 2022
Implicit Neural Representations for Image Compression
Yannick Strümpler, Janis Postels, Ren Yang, Luc van Gool, F. Tombari
08 Dec 2021
A TinyML Platform for On-Device Continual Learning with Quantized Latent Replays
Leonardo Ravaglia, Manuele Rusci, D. Nadalini, Alessandro Capotondi, Francesco Conti, Luca Benini
20 Oct 2021