ResearchTrend.AI
A Contrastive Learning Scheme with Transformer Innate Patches

26 March 2023
S. Jyhne, Per-Arne Andersen, Morten Goodwin Olsen
ViT

Papers citing "A Contrastive Learning Scheme with Transformer Innate Patches"

4 / 4 papers shown

On The Computational Complexity of Self-Attention
Feyza Duman Keles, Pruthuvi Maheshakya Wijewardena, C. Hegde
11 Sep 2022

UniFormer: Unifying Convolution and Self-attention for Visual Recognition
Kunchang Li, Yali Wang, Junhao Zhang, Peng Gao, Guanglu Song, Yu Liu, Hongsheng Li, Yu Qiao
ViT
24 Jan 2022

Semi-Supervised Semantic Segmentation with Pixel-Level Contrastive Learning from a Class-wise Memory Bank
Inigo Alonso, Alberto Sabater, David Ferstl, Luis Montesano, Ana C. Murillo
SSL, CLL
27 Apr 2021

A Novel Transformer Based Semantic Segmentation Scheme for Fine-Resolution Remote Sensing Images
Libo Wang, Rui Li, Chenxi Duan, Ce Zhang, Xiaoliang Meng, Shenghui Fang
ViT
25 Apr 2021