ResearchTrend.AI

Synergistic Self-supervised and Quantization Learning
European Conference on Computer Vision (ECCV), 2022 · 12 July 2022 · arXiv: 2207.05432
Yunhao Cao, Peiqin Sun, Yechang Huang, Jianxin Wu, Shuchang Zhou · MQ
Links: ArXiv (abs) · PDF · HTML · GitHub (73★)

Papers citing "Synergistic Self-supervised and Quantization Learning"

10 / 10 papers shown
MEC-Quant: Maximum Entropy Coding for Extremely Low Bit Quantization-Aware Training
Junbiao Pang, Tianyang Cai, Baochang Zhang · MQ · 19 Sep 2025
AIRCHITECT v2: Learning the Hardware Accelerator Design Space through Unified Representations
Design, Automation and Test in Europe (DATE), 2025
Jamin Seo, Akshat Ramachandran, Yu-Chuan Chuang, Anirudh Itagi, Tushar Krishna · AI4CE · 20 Jan 2025
CLAMP-ViT: Contrastive Data-Free Learning for Adaptive Post-Training Quantization of ViTs
Akshat Ramachandran, Souvik Kundu, Tushar Krishna · MQ · 07 Jul 2024
On Improving the Algorithm-, Model-, and Data- Efficiency of Self-Supervised Learning
Yunhao Cao, Jianxin Wu · 30 Apr 2024
Fed-QSSL: A Framework for Personalized Federated Learning under Bitwidth and Data Heterogeneity
Yiyue Chen, H. Vikalo, C. Wang · FedML · 20 Dec 2023
Jumping through Local Minima: Quantization in the Loss Landscape of Vision Transformers
IEEE International Conference on Computer Vision (ICCV), 2023
N. Frumkin, Dibakar Gope, Diana Marculescu · MQ · 21 Aug 2023
Three Guidelines You Should Know for Universally Slimmable Self-Supervised Learning
Computer Vision and Pattern Recognition (CVPR), 2023
Yunhao Cao, Peiqin Sun, Shuchang Zhou · 13 Mar 2023
Randomized Quantization: A Generic Augmentation for Data Agnostic Self-supervised Learning
IEEE International Conference on Computer Vision (ICCV), 2023
Huimin Wu, Chenyang Lei, Xiao Sun, Pengju Wang, Qifeng Chen, Kwang-Ting Cheng, Stephen Lin, Zhirong Wu · MQ · 19 Dec 2022
CPT-V: A Contrastive Approach to Post-Training Quantization of Vision Transformers
N. Frumkin, Dibakar Gope, Diana Marculescu · ViT · MQ · 17 Nov 2022
Unsupervised Learning of Visual Features by Contrasting Cluster Assignments
Mathilde Caron, Ishan Misra, Julien Mairal, Priya Goyal, Piotr Bojanowski, Armand Joulin · OCL · SSL · 17 Jun 2020