ResearchTrend.AI
Using Perturbed Length-aware Positional Encoding for Non-autoregressive Neural Machine Translation

arXiv:2107.13689
29 July 2021
Yuichi Oka, Katsuhito Sudoh, Satoshi Nakamura

Papers citing "Using Perturbed Length-aware Positional Encoding for Non-autoregressive Neural Machine Translation"

2 / 2 papers shown
Sentence-Level or Token-Level? A Comprehensive Study on Knowledge Distillation
Jingxuan Wei, Linzhuang Sun, Yichong Leng, Xu Tan, Bihui Yu, Ruifeng Guo
23 Apr 2024
SHAPE: Shifted Absolute Position Embedding for Transformers
Shun Kiyono, Sosuke Kobayashi, Jun Suzuki, Kentaro Inui
13 Sep 2021