SwiftLearn: A Data-Efficient Training Method of Deep Learning Models using Importance Sampling (arXiv:2311.15134)

25 November 2023
Habib Hajimolahoseini
Omar Mohamed Awad
Walid Ahmed
Austin Wen
Saina Asani
Mohammad Hassanpour
Farnoosh Javadi
Mehdi Ahmadi
Foozhan Ataiefard
Kangling Liu
Yang Liu

Papers citing "SwiftLearn: A Data-Efficient Training Method of Deep Learning Models using Importance Sampling"

4 citing papers listed.
  1. Slamming: Training a Speech Language Model on One GPU in a Day
     Gallil Maimon, Avishai Elmakies, Yossi Adi
     Annual Meeting of the Association for Computational Linguistics (ACL), 2025 · 19 Feb 2025

  2. Is 3D Convolution with 5D Tensors Really Necessary for Video Analysis?
     Habib Hajimolahoseini, Walid Ahmed, Austin Wen, Yang Liu
     23 Jul 2024

  3. Improving Resnet-9 Generalization Trained on Small Datasets
     Omar Mohamed Awad, Habib Hajimolahoseini, Michael Lim, Gurpreet Gosal, Walid Ahmed, Yang Liu, Gordon Deng
     07 Sep 2023

  4. Training Acceleration of Low-Rank Decomposed Networks using Sequential Freezing and Rank Quantization
     Habib Hajimolahoseini, Walid Ahmed, Yang Liu
     07 Sep 2023