
LUKE-Graph: A Transformer-based Approach with Gated Relational Graph Attention for Cloze-style Reading Comprehension

Neurocomputing, 2023
12 March 2023
Shima Foolad
Kourosh Kiani

Papers citing "LUKE-Graph: A Transformer-based Approach with Gated Relational Graph Attention for Cloze-style Reading Comprehension"

3 papers:

1. Trustful LLMs: Customizing and Grounding Text Generation with Knowledge Bases and Dual Decoders
   Xiaofeng Zhu, Jaya Krishna Mandivarapu
   12 Nov 2024

2. Recent Advances in Multi-Choice Machine Reading Comprehension: A Survey on Methods and Datasets
   Shima Foolad, Kourosh Kiani, R. Rastgoo
   04 Aug 2024

3. Integrating a Heterogeneous Graph with Entity-aware Self-attention using Relative Position Labels for Reading Comprehension Model
   Shima Foolad, Kourosh Kiani
   19 Jul 2023