Finding Order in Chaos: A Novel Data Augmentation Method for Time Series in Contrastive Learning

23 September 2023
B. U. Demirel, Christian Holz
AI4TS
arXiv:2309.13439

Papers citing "Finding Order in Chaos: A Novel Data Augmentation Method for Time Series in Contrastive Learning"

8 citing papers
Wearable Accelerometer Foundation Models for Health via Knowledge Distillation
Salar Abbaspourazad, Anshuman Mishra, Joseph D. Futoma, Andrew C. Miller, Ian Shapiro
15 Dec 2024
SimPer: Simple Self-Supervised Learning of Periodic Targets
Yuzhe Yang, Xin Liu, Jiang Wu, Silviu Borac, Dina Katabi, M. Poh, Daniel J. McDuff
06 Oct 2022
SpecMix: A Mixed Sample Data Augmentation method for Training with Time-Frequency Domain Features
Gwantae Kim, D. Han, Hanseok Ko
06 Aug 2021
Improving Contrastive Learning by Visualizing Feature Transformation
Rui Zhu, Bingchen Zhao, Jingen Liu, Zhenglong Sun, C. L. P. Chen
SSL
06 Aug 2021
With a Little Help from My Friends: Nearest-Neighbor Contrastive Learning of Visual Representations
Debidatta Dwibedi, Y. Aytar, Jonathan Tompson, P. Sermanet, Andrew Zisserman
SSL
29 Apr 2021
Co-Mixup: Saliency Guided Joint Mixup with Supermodular Diversity
Jang-Hyun Kim, Wonho Choo, Hosan Jeong, Hyun Oh Song
05 Feb 2021
LRC-BERT: Latent-representation Contrastive Knowledge Distillation for Natural Language Understanding
Hao Fu, Shaojun Zhou, Qihong Yang, Junjie Tang, Guiquan Liu, Kaikui Liu, Xiaolong Li
14 Dec 2020
MixCo: Mix-up Contrastive Learning for Visual Representation
Sungnyun Kim, Gihun Lee, Sangmin Bae, Seyoung Yun
SSL
13 Oct 2020