DeFormer: Decomposing Pre-trained Transformers for Faster Question Answering
arXiv:2005.00697 · 2 May 2020
Qingqing Cao, H. Trivedi, A. Balasubramanian, Niranjan Balasubramanian
Papers citing "DeFormer: Decomposing Pre-trained Transformers for Faster Question Answering" (9 papers)
EAVE: Efficient Product Attribute Value Extraction via Lightweight Sparse-layer Interaction
Li Yang, Qifan Wang, Jianfeng Chi, Jiahao Liu, Jingang Wang, Fuli Feng, Zenglin Xu, Yi Fang, Lifu Huang, Dongfang Liu · 10 Jun 2024
Comparing Neighbors Together Makes it Easy: Jointly Comparing Multiple Candidates for Efficient and Effective Retrieval
Jonghyun Song, Cheyon Jin, Wenlong Zhao, Jay Yoon Lee · 21 May 2024
Vesper: A Compact and Effective Pretrained Model for Speech Emotion Recognition [VLM]
Weidong Chen, Xiaofen Xing, Peihao Chen, Xiangmin Xu · 20 Jul 2023
Improving Text Semantic Similarity Modeling through a 3D Siamese Network [3DPC]
Jianxiang Zang, Hui Liu · 18 Jul 2023
Investigating the Role of Feed-Forward Networks in Transformers Using Parallel Attention and Feed-Forward Net Design
Shashank Sonkar, Richard G. Baraniuk · 22 May 2023
Question Generation for Evaluating Cross-Dataset Shifts in Multi-modal Grounding [OOD]
Arjun Reddy Akula · 24 Jan 2022
VIRT: Improving Representation-based Models for Text Matching through Virtual Interaction
Dan Li, Yang Yang, Hongyin Tang, Jingang Wang, Tong Bill Xu, Wei Yu Wu, Enhong Chen · 08 Dec 2021
Optimizing Inference Performance of Transformers on CPUs
D. Dice, Alex Kogan · 12 Feb 2021
Learning Dense Representations of Phrases at Scale [RALM, DML, NAI]
Jinhyuk Lee, Mujeen Sung, Jaewoo Kang, Danqi Chen · 23 Dec 2020