
Hybrid Attention-Based Transformer Block Model for Distant Supervision Relation Extraction

Neurocomputing, 2020
10 March 2020
Yan Xiao
Yaochu Jin
Ran Cheng
K. Hao

Papers citing "Hybrid Attention-Based Transformer Block Model for Distant Supervision Relation Extraction"

2 papers
Adaptive Prototypical Networks with Label Words and Joint Representation Learning for Few-Shot Relation Classification
IEEE Transactions on Neural Networks and Learning Systems (TNNLS), 2021
Yan Xiao
Yaochu Jin
K. Hao
10 Jan 2021
Finding Influential Instances for Distantly Supervised Relation Extraction
International Conference on Computational Linguistics (COLING), 2020
Zifeng Wang
Rui Wen
Xi Chen
Shao-Lun Huang
Ningyu Zhang
Yefeng Zheng
17 Sep 2020