arXiv: 2203.11239
DQ-BART: Efficient Sequence-to-Sequence Model via Joint Distillation and Quantization (MQ)
Zheng Li, Zijian Wang, Ming Tan, Ramesh Nallapati, Parminder Bhatia, Andrew O. Arnold, Bing Xiang, Dan Roth
21 March 2022
Papers citing "DQ-BART: Efficient Sequence-to-Sequence Model via Joint Distillation and Quantization" (7 / 7 papers shown):
- Expand, Rerank, and Retrieve: Query Reranking for Open-Domain Question Answering (LRM). Yung-Sung Chuang, Wei Fang, Shang-Wen Li, Wen-tau Yih, James R. Glass. 26 May 2023.
- A Systematic Study of Knowledge Distillation for Natural Language Generation with Pseudo-Target Training. Nitay Calderon, Subhabrata Mukherjee, Roi Reichart, Amir Kantor. 3 May 2023.
- A Comprehensive Survey of AI-Generated Content (AIGC): A History of Generative AI from GAN to ChatGPT. Yihan Cao, Siyu Li, Yixin Liu, Zhiling Yan, Yutong Dai, Philip S. Yu, Lichao Sun. 7 Mar 2023.
- Can Open-Domain QA Reader Utilize External Knowledge Efficiently like Humans? (RALM). Neeraj Varshney, Man Luo, Chitta Baral. 23 Nov 2022.
- EdgeFormer: A Parameter-Efficient Transformer for On-Device Seq2seq Generation (MoE). Tao Ge, Si-Qing Chen, Furu Wei. 16 Feb 2022.
- BinaryBERT: Pushing the Limit of BERT Quantization (MQ). Haoli Bai, Wei Zhang, Lu Hou, Lifeng Shang, Jing Jin, Xin Jiang, Qun Liu, Michael Lyu, Irwin King. 31 Dec 2020.
- Temporal Reasoning on Implicit Events from Distant Supervision. Ben Zhou, Kyle Richardson, Qiang Ning, Tushar Khot, Ashish Sabharwal, Dan Roth. 24 Oct 2020.