In defense of parameter sharing for model-compression
Aditya Desai, Anshumali Shrivastava
17 October 2023 · arXiv:2310.11611

Papers citing "In defense of parameter sharing for model-compression" (3 of 3 papers shown)
Changing Base Without Losing Pace: A GPU-Efficient Alternative to MatMul in DNNs
Nir Ailon, Akhiad Bercovich, Omri Weinstein · 15 Mar 2025

Meta-Sparsity: Learning Optimal Sparse Structures in Multi-task Networks through Meta-learning
Richa Upadhyay, Ronald Phlypo, Rajkumar Saini, Marcus Liwicki · 21 Jan 2025

MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications
Andrew G. Howard, Menglong Zhu, Bo Chen, Dmitry Kalenichenko, Weijun Wang, Tobias Weyand, M. Andreetto, Hartwig Adam · 17 Apr 2017