Ti-MAE: Self-Supervised Masked Time Series Autoencoders
arXiv:2301.08871 · 21 January 2023
Zhe Li, Zhongwen Rao, Lujia Pan, Pengyun Wang, Zenglin Xu
AI4TS

Papers citing "Ti-MAE: Self-Supervised Masked Time Series Autoencoders" (5 of 5 papers shown)

Towards Generalisable Time Series Understanding Across Domains
Özgün Turgut, Philip Muller, M. Menten, Daniel Rueckert
AI4TS · 36 / 1 / 0 · 09 Oct 2024
A Unified Masked Autoencoder with Patchified Skeletons for Motion Synthesis
Esteve Valls Mascaro, Hyemin Ahn, Dongheui Lee
CVBM · 29 / 4 / 0 · 14 Aug 2023
CoST: Contrastive Learning of Disentangled Seasonal-Trend Representations for Time Series Forecasting
Gerald Woo, Chenghao Liu, Doyen Sahoo, Akshat Kumar, Steven C. H. Hoi
AI4TS · 111 / 391 / 0 · 03 Feb 2022
Masked Autoencoders Are Scalable Vision Learners
Kaiming He, Xinlei Chen, Saining Xie, Yanghao Li, Piotr Dollár, Ross B. Girshick
ViT, TPM · 258 / 7,337 / 0 · 11 Nov 2021
Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting
Haoyi Zhou, Shanghang Zhang, J. Peng, Shuai Zhang, Jianxin Li, Hui Xiong, Wan Zhang
AI4TS · 167 / 3,799 / 0 · 14 Dec 2020