ResearchTrend.AI

arXiv:2501.13467
Multi-Level Attention and Contrastive Learning for Enhanced Text Classification with an Optimized Transformer
23 January 2025
Jia Gao, Guiran Liu, Binrong Zhu, Shicheng Zhou, Hongye Zheng, Xiaoxuan Liao

Papers citing "Multi-Level Attention and Contrastive Learning for Enhanced Text Classification with an Optimized Transformer"
3 / 3 papers shown
Adaptive Transformer Attention and Multi-Scale Fusion for Spine 3D Segmentation
Yanlin Xiang, Qingyuan He, Ting Xu, Ran Hao, Jiacheng Hu, Hanchao Zhang
MedIm · 17 Mar 2025
Context-Aware Rule Mining Using a Dynamic Transformer-Based Framework
Jie Liu, Yiwei Zhang, Yuan Sheng, Yujia Lou, Haige Wang, Bohuan Yang
14 Mar 2025
Optimized Unet with Attention Mechanism for Multi-Scale Semantic Segmentation
Xuan Li, Quanchao Lu, Yankaiqi Li, Muqing Li, Yijiashun Qi
SSeg · 06 Feb 2025