ResearchTrend.AI

arXiv:2307.03347
Distilling Universal and Joint Knowledge for Cross-Domain Model Compression on Time Series Data

7 July 2023
Qing Xu, Min Wu, Xiaoli Li, K. Mao, Zhenghua Chen

Papers citing "Distilling Universal and Joint Knowledge for Cross-Domain Model Compression on Time Series Data"

3 of 3 papers shown
Finding Foundation Models for Time Series Classification with a PreText Task
Ali Ismail-Fawaz, Maxime Devanne, Stefano Berretti, Jonathan Weber, Germain Forestier
24 Nov 2023
Pruning and Quantization for Deep Neural Network Acceleration: A Survey
Tailin Liang, C. Glossner, Lei Wang, Shaobo Shi, Xiaotong Zhang
Communities: MQ
24 Jan 2021
Domain-Adversarial Training of Neural Networks
Yaroslav Ganin, E. Ustinova, Hana Ajakan, Pascal Germain, Hugo Larochelle, François Laviolette, M. Marchand, Victor Lempitsky
Communities: GAN, OOD
28 May 2015