MHSAN: Multi-Head Self-Attention Network for Visual Semantic Embedding

IEEE Winter Conference on Applications of Computer Vision (WACV), 2020
11 January 2020
Geondo Park, Chihye Han, Wonjun Yoon, Dae-Shik Kim

Papers citing "MHSAN: Multi-Head Self-Attention Network for Visual Semantic Embedding"

4 of 4 papers shown
Think Parallax: Solving Multi-Hop Problems via Multi-View Knowledge-Graph-Based Retrieval-Augmented Generation
Jinliang Liu, Jiale Bai, Shaoning Zeng
17 Oct 2025
Object Affordance Recognition and Grounding via Multi-scale Cross-modal Representation Learning
Xinhang Wan, Dongqiang Gou, Xinwang Liu, En Zhu, Xuming He
02 Aug 2025
Efficient Image-Text Retrieval via Keyword-Guided Pre-Screening
Min Cao, Yang Bai, Wenwen Qiang, Ziqiang Cao, Liqiang Nie, Min Zhang
14 Mar 2023
Paying More Attention to Self-attention: Improving Pre-trained Language Models via Attention Guiding
Shanshan Wang, Zhumin Chen, Zhaochun Ren, Huasheng Liang, Qiang Yan, Sudipta Singha Roy
06 Apr 2022