Lightweight, Pre-trained Transformers for Remote Sensing Timeseries

27 April 2023
Gabriel Tseng
Ruben Cartuyvels
Ivan Zvonkov
Mirali Purohit
David Rolnick
Hannah Kerner

Papers citing "Lightweight, Pre-trained Transformers for Remote Sensing Timeseries"

3 / 3 papers shown
Scale-MAE: A Scale-Aware Masked Autoencoder for Multiscale Geospatial Representation Learning
Colorado Reed
Ritwik Gupta
Shufan Li
S. Brockman
Christopher Funk
Brian Clipp
Kurt Keutzer
Salvatore Candido
M. Uyttendaele
Trevor Darrell
30 Dec 2022
Masked Autoencoders Are Scalable Vision Learners
Kaiming He
Xinlei Chen
Saining Xie
Yanghao Li
Piotr Dollár
Ross B. Girshick
ViT
TPM
11 Nov 2021
A Generalizable and Accessible Approach to Machine Learning with Global Satellite Imagery
Esther Rolf
J. Proctor
T. Carleton
I. Bolliger
Vaishaal Shankar
Miyabi Ishihara
Benjamin Recht
S. Hsiang
16 Oct 2020