Hi-Transformer: Hierarchical Interactive Transformer for Efficient and Effective Long Document Modeling

arXiv:2106.01040 · 2 June 2021
Chuhan Wu, Fangzhao Wu, Tao Qi, Yongfeng Huang

Papers citing "Hi-Transformer: Hierarchical Interactive Transformer for Efficient and Effective Long Document Modeling" (12 of 12 papers shown)

  1. ELITR-Bench: A Meeting Assistant Benchmark for Long-Context Language Models
     Thibaut Thonet, Jos Rozen, Laurent Besacier · RALM · 132 / 2 / 0 · 20 Jan 2025
  2. HDT: Hierarchical Document Transformer
     Haoyu He, Markus Flicke, Jan Buchmann, Iryna Gurevych, Andreas Geiger · 35 / 0 / 0 · 11 Jul 2024
  3. DEPTH: Discourse Education through Pre-Training Hierarchically
     Zachary Bamberger, Ofek Glick, Chaim Baskin, Yonatan Belinkov · 59 / 0 / 0 · 13 May 2024
  4. Fovea Transformer: Efficient Long-Context Modeling with Structured Fine-to-Coarse Attention
     Ziwei He, Jian Yuan, Le Zhou, Jingwen Leng, Bo Jiang · 27 / 0 / 0 · 13 Nov 2023
  5. Attention over pre-trained Sentence Embeddings for Long Document Classification
     Amine Abdaoui, Sourav Dutta · 6 / 1 / 0 · 18 Jul 2023
  6. SpeechFormer++: A Hierarchical Efficient Framework for Paralinguistic Speech Processing
     Weidong Chen, Xiaofen Xing, Xiangmin Xu, Jianxin Pang, Lan Du · 30 / 38 / 0 · 27 Feb 2023
  7. A Survey on Natural Language Processing for Programming
     Qingfu Zhu, Xianzhen Luo, Fang Liu, Cuiyun Gao, Wanxiang Che · 23 / 1 / 0 · 12 Dec 2022
  8. ConReader: Exploring Implicit Relations in Contracts for Contract Clause Extraction
     Weiwen Xu, Yang Deng, Wenqiang Lei, Wenlong Zhao, Tat-Seng Chua, W. Lam · AILaw · 18 / 6 / 0 · 17 Oct 2022
  9. An Exploration of Hierarchical Attention Transformers for Efficient Long Document Classification
     Ilias Chalkidis, Xiang Dai, Manos Fergadiotis, Prodromos Malakasiotis, Desmond Elliott · 32 / 33 / 0 · 11 Oct 2022
  10. Fastformer: Additive Attention Can Be All You Need
      Chuhan Wu, Fangzhao Wu, Tao Qi, Yongfeng Huang, Xing Xie · 35 / 117 / 0 · 20 Aug 2021
  11. A Survey of Transformers
      Tianyang Lin, Yuxin Wang, Xiangyang Liu, Xipeng Qiu · ViT · 29 / 1,086 / 0 · 08 Jun 2021
  12. Big Bird: Transformers for Longer Sequences
      Manzil Zaheer, Guru Guruganesh, Kumar Avinava Dubey, Joshua Ainslie, Chris Alberti, ..., Philip Pham, Anirudh Ravula, Qifan Wang, Li Yang, Amr Ahmed · VLM · 262 / 2,013 / 0 · 28 Jul 2020