When and Why are Pre-trained Word Embeddings Useful for Neural Machine Translation?

17 April 2018
Ye Qi, Devendra Singh Sachan, Matthieu Felix, Sarguna Padmanabhan, Graham Neubig
arXiv:1804.06323

Papers citing "When and Why are Pre-trained Word Embeddings Useful for Neural Machine Translation?"

3 of 203 citing papers shown:
Google's Neural Machine Translation System: Bridging the Gap between Human and Machine Translation
Yonghui Wu, M. Schuster, Z. Chen, Quoc V. Le, Mohammad Norouzi, ..., Alex Rudnick, Oriol Vinyals, G. Corrado, Macduff Hughes, J. Dean
Communities: AIMat
26 Sep 2016
Multi-Way, Multilingual Neural Machine Translation with a Shared Attention Mechanism
Orhan Firat, Kyunghyun Cho, Yoshua Bengio
Communities: LRM, AIMat
06 Jan 2016
Convolutional Neural Networks for Sentence Classification
Yoon Kim
Communities: AILaw, VLM
25 Aug 2014