Knowledge Distillation for Quality Estimation (arXiv: 2107.00411)
1 July 2021
Amit Gajbhiye, M. Fomicheva, Fernando Alva-Manchego, Frédéric Blain, A. Obamuyide, Nikolaos Aletras, Lucia Specia
Papers citing "Knowledge Distillation for Quality Estimation" (6 / 6 papers shown)
- TimeDistill: Efficient Long-Term Time Series Forecasting with MLP via Cross-Architecture Distillation — Juntong Ni, Z. Liu, Shiyu Wang, Ming Jin, Wei-dong Jin (AI4TS). 24 Feb 2025.
- Sentence-Level or Token-Level? A Comprehensive Study on Knowledge Distillation — Jingxuan Wei, Linzhuang Sun, Yichong Leng, Xu Tan, Bihui Yu, Ruifeng Guo. 23 Apr 2024.
- Unraveling Key Factors of Knowledge Distillation — Jingxuan Wei, Linzhuang Sun, Xu Tan, Bihui Yu, Ruifeng Guo. 14 Dec 2023.
- SOLD: Sinhala Offensive Language Dataset — Tharindu Ranasinghe, Isuri Anuradha, Damith Premasiri, Kanishka Silva, Hansi Hettiarachchi, Lasitha Uyangodage, Marcos Zampieri. 01 Dec 2022.
- QUAK: A Synthetic Quality Estimation Dataset for Korean-English Neural Machine Translation — Sugyeong Eo, Chanjun Park, Hyeonseok Moon, Jaehyung Seo, Gyeongmin Kim, Jungseob Lee, Heu-Jeoung Lim. 30 Sep 2022.
- Simple and Scalable Predictive Uncertainty Estimation using Deep Ensembles — Balaji Lakshminarayanan, Alexander Pritzel, Charles Blundell (UQCV, BDL). 05 Dec 2016.