Small Batch Sizes Improve Training of Low-Resource Neural MT
arXiv:2203.10579 · 20 March 2022
Àlex R. Atrio, Andrei Popescu-Belis
Papers citing "Small Batch Sizes Improve Training of Low-Resource Neural MT" (4 of 4 papers shown)

1. Improving Transformer Performance for French Clinical Notes Classification Using Mixture of Experts on a Limited Dataset [MoE, MedIm]
   Thanh-Dung Le, P. Jouvet, R. Noumeir
   22 Mar 2023

2. Too Brittle To Touch: Comparing the Stability of Quantization and Distillation Towards Developing Lightweight Low-Resource MT Models
   Harshita Diddee, Sandipan Dandapat, Monojit Choudhury, T. Ganu, Kalika Bali
   27 Oct 2022

3. OpenNMT: Neural Machine Translation Toolkit
   Guillaume Klein, Yoon Kim, Yuntian Deng, Vincent Nguyen, Jean Senellart, Alexander M. Rush
   28 May 2018

4. On Large-Batch Training for Deep Learning: Generalization Gap and Sharp Minima [ODL]
   N. Keskar, Dheevatsa Mudigere, J. Nocedal, M. Smelyanskiy, P. T. P. Tang
   15 Sep 2016