Where Should We Begin? A Low-Level Exploration of Weight Initialization Impact on Quantized Behaviour of Deep Neural Networks

30 November 2020
S. Yun, A. Wong
MQ
ArXiv · PDF · HTML

Papers citing "Where Should We Begin? A Low-Level Exploration of Weight Initialization Impact on Quantized Behaviour of Deep Neural Networks"

2 of 2 papers shown
GHN-QAT: Training Graph Hypernetworks to Predict Quantization-Robust Parameters of Unseen Limited Precision Neural Networks
S. Yun, Alexander Wong
MQ
24 Sep 2023

Do All MobileNets Quantize Poorly? Gaining Insights into the Effect of Quantization on Depthwise Separable Convolutional Networks Through the Eyes of Multi-scale Distributional Dynamics
S. Yun, Alexander Wong
MQ
24 Apr 2021