DeGAN: Data-Enriching GAN for Retrieving Representative Samples from a Trained Classifier (arXiv:1912.11960)
27 December 2019
Sravanti Addepalli, Gaurav Kumar Nayak, Anirban Chakraborty, R. Venkatesh Babu
Papers citing "DeGAN: Data-Enriching GAN for Retrieving Representative Samples from a Trained Classifier" (21 of 21 papers shown)
FedRecon: Missing Modality Reconstruction in Distributed Heterogeneous Environments
Junming Liu, Guosun Zeng, Ding Wang, Yanting Gao, Yufei Jin · 14 Apr 2025

Efficient Model Extraction via Boundary Sampling
Maor Biton Dor, Yisroel Mirsky · 20 Oct 2024 · Tags: MLAU, MIACV, AAML

CaBaGe: Data-Free Model Extraction using ClAss BAlanced Generator Ensemble
Jonathan Rosenthal, Shanchao Liang, Kevin Zhang, Lin Tan · 16 Sep 2024 · Tags: MIACV

Encapsulating Knowledge in One Prompt
Qi Li, Runpeng Yu, Xinchao Wang · 16 Jul 2024 · Tags: VLM, KELM

Towards Few-Call Model Stealing via Active Self-Paced Knowledge Distillation and Diffusion-Based Image Generation
Vlad Hondru, Radu Tudor Ionescu · 29 Sep 2023 · Tags: DiffM

Lion: Adversarial Distillation of Proprietary Large Language Models
Yuxin Jiang, Chunkit Chan, Mingyang Chen, Wei Wang · 22 May 2023 · Tags: ALM

Learning to Retain while Acquiring: Combating Distribution-Shift in Adversarial Data-Free Knowledge Distillation
Gaurav Patel, Konda Reddy Mopuri, Qiang Qiu · 28 Feb 2023

Scalable Collaborative Learning via Representation Sharing
Frédéric Berdoz, Abhishek Singh, Martin Jaggi, Ramesh Raskar · 20 Nov 2022 · Tags: FedML

Self-distilled Knowledge Delegator for Exemplar-free Class Incremental Learning
Fanfan Ye, Liang Ma, Qiaoyong Zhong, Di Xie, Shiliang Pu · 23 May 2022 · Tags: BDL, CLL

Towards Data-Free Model Stealing in a Hard Label Setting
Sunandini Sanyal, Sravanti Addepalli, R. Venkatesh Babu · 23 Apr 2022 · Tags: AAML

Robust and Resource-Efficient Data-Free Knowledge Distillation by Generative Pseudo Replay
Kuluhan Binici, Shivam Aggarwal, N. Pham, K. Leman, T. Mitra · 09 Jan 2022 · Tags: TTA

Data-Free Knowledge Transfer: A Survey
Yuang Liu, Wei Zhang, Jun Wang, Jianyong Wang · 31 Dec 2021

DeepSteal: Advanced Model Extractions Leveraging Efficient Weight Stealing in Memories
Adnan Siraj Rakin, Md Hafizul Islam Chowdhuryy, Fan Yao, Deliang Fan · 08 Nov 2021 · Tags: AAML, MIACV

Preventing Catastrophic Forgetting and Distribution Mismatch in Knowledge Distillation via Synthetic Data
Kuluhan Binici, N. Pham, T. Mitra, K. Leman · 11 Aug 2021

Data synthesis and adversarial networks: A review and meta-analysis in cancer imaging
Richard Osuala, Kaisar Kushibar, Lidia Garrucho, Akis Linardos, Zuzanna Szafranowska, Stefan Klein, Ben Glocker, Oliver Díaz, Karim Lekadir · 20 Jul 2021 · Tags: MedIm

Representation Consolidation for Training Expert Students
Zhizhong Li, Avinash Ravichandran, Charless C. Fowlkes, M. Polito, Rahul Bhotika, Stefano Soatto · 16 Jul 2021

Domain Impression: A Source Data Free Domain Adaptation Method
V. Kurmi, Venkatesh Subramanian, Vinay P. Namboodiri · 17 Feb 2021 · Tags: TTA

Mining Data Impressions from Deep Models as Substitute for the Unavailable Training Data
Gaurav Kumar Nayak, Konda Reddy Mopuri, Saksham Jain, Anirban Chakraborty · 15 Jan 2021

Effectiveness of Arbitrary Transfer Sets for Data-free Knowledge Distillation
Gaurav Kumar Nayak, Konda Reddy Mopuri, Anirban Chakraborty · 18 Nov 2020

Data-free Knowledge Distillation for Segmentation using Data-Enriching GAN
Kaushal Bhogale · 02 Nov 2020

Black-Box Ripper: Copying black-box models using generative evolutionary algorithms
Antonio Bărbălău, Adrian Cosma, Radu Tudor Ionescu, Marius Popescu · 21 Oct 2020 · Tags: MIACV, MLAU