ResearchTrend.AI
© 2025 ResearchTrend.AI, All rights reserved.

DiSparse: Disentangled Sparsification for Multitask Model Compression

9 June 2022
Xinglong Sun, Ali Hassani, Zhangyang Wang, Gao Huang, Humphrey Shi

Papers citing "DiSparse: Disentangled Sparsification for Multitask Model Compression"

5 / 5 papers shown
Advancing Weight and Channel Sparsification with Enhanced Saliency
Xinglong Sun, Maying Shen, Hongxu Yin, Lei Mao, Pavlo Molchanov, Jose M. Alvarez
05 Feb 2025
AdapMTL: Adaptive Pruning Framework for Multitask Learning Model
Mingcan Xiang, Steven Jiaxun Tang, Qizheng Yang, Hui Guan, Tongping Liu
VLM
07 Aug 2024
Continual Learning through Networks Splitting and Merging with Dreaming-Meta-Weighted Model Fusion
Yi Sun, Xin Xu, Jian Li, Guanglei Xie, Yifei Shi, Qiang Fang
CLL, MoMe
12 Dec 2023
Towards Compute-Optimal Transfer Learning
Massimo Caccia, Alexandre Galashov, Arthur Douillard, Amal Rannen-Triki, Dushyant Rao, Michela Paganini, Laurent Charlin, Marc'Aurelio Ranzato, Razvan Pascanu
25 Apr 2023
Deep Elastic Networks with Model Selection for Multi-Task Learning
Chanho Ahn, Eunwoo Kim, Songhwai Oh
11 Sep 2019