Arch-Net: Model Distillation for Architecture Agnostic Model Deployment

1 November 2021
Weixin Xu
Zipeng Feng
Shuangkang Fang
Song Yuan
Yi Yang
Shuchang Zhou
ArXiv (abs) · PDF · HTML · GitHub (22★)

Papers citing "Arch-Net: Model Distillation for Architecture Agnostic Model Deployment"

One is All: Bridging the Gap Between Neural Radiance Fields Architectures with Progressive Volume Distillation. AAAI Conference on Artificial Intelligence (AAAI), 2022.
Shuangkang Fang
Weixin Xu
Heng Wang
Yi Yang
Yu-feng Wang
Shuchang Zhou
29 Nov 2022