Pre-training without Natural Images (arXiv:2101.08515)

21 January 2021

Hirokatsu Kataoka, Kazushige Okayasu, Asato Matsumoto, Eisuke Yamagata, Ryosuke Yamada, Nakamasa Inoue, Akio Nakamura, Y. Satoh

Papers citing "Pre-training without Natural Images"

13 of 13 citing papers shown:
1. Formula-Supervised Sound Event Detection: Pre-Training Without Real Data (06 Apr 2025)
   Yuto Shibata, Keitaro Tanaka, Yoshiaki Bando, Keisuke Imoto, Hirokatsu Kataoka, Yoshimitsu Aoki
2. RealTraj: Towards Real-World Pedestrian Trajectory Forecasting (26 Nov 2024) [AI4TS]
   Ryo Fujii, Hideo Saito, Ryo Hachiuma
3. Fractals as Pre-training Datasets for Anomaly Detection and Localization (11 May 2024)
   C. Ugwu, S. Casarin, O. Lanz
4. Learning Universal Predictors (26 Jan 2024)
   Jordi Grau-Moya, Tim Genewein, Marcus Hutter, Laurent Orseau, Grégoire Delétang, ..., Anian Ruoss, Wenliang Kevin Li, Christopher Mattern, Matthew Aitchison, J. Veness
5. Predicting Gradient is Better: Exploring Self-Supervised Learning for SAR ATR with a Joint-Embedding Predictive Architecture (26 Nov 2023)
   Wei-Jang Li, Yang Wei, Tianpeng Liu, Yuenan Hou, Yuxuan Li, Zhen Liu, Yongxiang Liu, Li Liu
6. Asynchronous Federated Continual Learning (07 Apr 2023) [FedML, CLL]
   Donald Shenaj, Marco Toldo, Alberto Rigon, Pietro Zanuttigh
7. Empirical Evaluation and Theoretical Analysis for Representation Learning: A Survey (18 Apr 2022) [AI4TS]
   Kento Nozawa, Issei Sato
8. Towards Adversarial Evaluations for Inexact Machine Unlearning (17 Jan 2022) [AAML, ELM, MU]
   Shashwat Goel, Ameya Prabhu, Amartya Sanyal, Ser-Nam Lim, Philip H. S. Torr, Ponnurangam Kumaraguru
9. The Augmented Image Prior: Distilling 1000 Classes by Extrapolating from a Single Image (01 Dec 2021)
   Yuki M. Asano, Aaqib Saeed
10. Beyond Flatland: Pre-training with a Strong 3D Inductive Bias (30 Nov 2021)
    Shubhaankar Gupta, Thomas P. O'Connell, Bernhard Egger
11. Boosting Self-Supervised Learning via Knowledge Transfer (01 May 2018) [SSL]
    M. Noroozi, Ananth Vinjimoor, Paolo Favaro, Hamed Pirsiavash
12. MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications (17 Apr 2017) [3DH]
    Andrew G. Howard, Menglong Zhu, Bo Chen, Dmitry Kalenichenko, Weijun Wang, Tobias Weyand, M. Andreetto, Hartwig Adam
13. Aggregated Residual Transformations for Deep Neural Networks (16 Nov 2016)
    Saining Xie, Ross B. Girshick, Piotr Dollár, Z. Tu, Kaiming He