ResearchTrend.AI

© 2025 ResearchTrend.AI, All rights reserved.

arXiv:2308.07939 · Cited By
Ada-QPacknet -- adaptive pruning with bit width reduction as an efficient continual learning method without forgetting

14 August 2023
Marcin Pietroń, Dominik Żurek, Kamil Faber, Roberto Corizzo
Topics: CLL

Papers citing "Ada-QPacknet -- adaptive pruning with bit width reduction as an efficient continual learning method without forgetting"

4 papers

Title: Hadamard Domain Training with Integers for Class Incremental Quantized Learning
Authors: Martin Schiemer, Clemens J. S. Schaefer, Jayden Parker Vap, Mark Horeni, Yu Emma Wang, Juan Ye, Siddharth Joshi
Citations: 2 · 05 Oct 2023

Title: Is Class-Incremental Enough for Continual Learning?
Authors: Andrea Cossu, G. Graffieti, Lorenzo Pellegrini, Davide Maltoni, D. Bacciu, Antonio Carta, Vincenzo Lomonaco
Topics: CLL
Citations: 30 · 06 Dec 2021

Title: Adversarial Continual Learning
Authors: Sayna Ebrahimi, Franziska Meier, Roberto Calandra, Trevor Darrell, Marcus Rohrbach
Topics: CLL, VLM
Citations: 197 · 21 Mar 2020

Title: ImageNet Large Scale Visual Recognition Challenge
Authors: Olga Russakovsky, Jia Deng, Hao Su, J. Krause, S. Satheesh, ..., A. Karpathy, A. Khosla, Michael S. Bernstein, Alexander C. Berg, Li Fei-Fei
Topics: VLM, ObjD
Citations: 39,194 · 01 Sep 2014