BasisNet: Two-stage Model Synthesis for Efficient Inference
arXiv:2105.03014 · 7 May 2021
Mingda Zhang, Chun-Te Chu, A. Zhmoginov, Andrew G. Howard, Brendan Jou, Yukun Zhu, Li Zhang, R. Hwa, Adriana Kovashka
Topics: 3DH

Papers citing "BasisNet: Two-stage Model Synthesis for Efficient Inference"

6 papers shown
ORXE: Orchestrating Experts for Dynamically Configurable Efficiency
Qingyuan Wang, Guoxin Wang, B. Cardiff, Deepu John
07 May 2025

Tiny Models are the Computational Saver for Large Models
Qingyuan Wang, B. Cardiff, Antoine Frappé, Benoît Larras, Deepu John
26 Mar 2024

Soft Merging of Experts with Adaptive Routing
Mohammed Muqeeth, Haokun Liu, Colin Raffel
Topics: MoMe, MoE
06 Jun 2023

Federated Learning of Shareable Bases for Personalization-Friendly Image Classification
Hong-You Chen, Jike Zhong, Mingda Zhang, Xuhui Jia, Qi, Boqing Gong, Wei-Lun Chao, Li Zhang
Topics: FedML
16 Apr 2023

Scaling Up Your Kernels to 31x31: Revisiting Large Kernel Design in CNNs
Xiaohan Ding, X. Zhang, Yi Zhou, Jungong Han, Guiguang Ding, Jian-jun Sun
Topics: VLM
13 Mar 2022

Collaboration of Experts: Achieving 80% Top-1 Accuracy on ImageNet with 100M FLOPs
Yikang Zhang, Zhuo Chen, Zhaobai Zhong
Topics: MoE
08 Jul 2021