ResearchTrend.AI
Accelerating Asynchronous Stochastic Gradient Descent for Neural Machine Translation

27 August 2018
Nikolay Bogoychev
Marcin Junczys-Dowmunt
Kenneth Heafield
Alham Fikri Aji

Papers citing "Accelerating Asynchronous Stochastic Gradient Descent for Neural Machine Translation"

3 of 3 papers shown:
The University of Edinburgh's Submissions to the WMT19 News Translation Task
Rachel Bawden, Nikolay Bogoychev, Ulrich Germann, Roman Grundkiewicz, Faheem Kirefu, Antonio Valerio Miceli Barone, Alexandra Birch
12 Jul 2019
On Large-Batch Training for Deep Learning: Generalization Gap and Sharp Minima
N. Keskar, Dheevatsa Mudigere, J. Nocedal, M. Smelyanskiy, P. T. P. Tang
15 Sep 2016
Optimal Distributed Online Prediction using Mini-Batches
O. Dekel, Ran Gilad-Bachrach, Ohad Shamir, Lin Xiao
07 Dec 2010