On the Complementarity between Pre-Training and Random-Initialization for Resource-Rich Machine Translation

7 September 2022
Changtong Zan, Liang Ding, Li Shen, Yu Cao, Weifeng Liu, Dacheng Tao

Papers citing "On the Complementarity between Pre-Training and Random-Initialization for Resource-Rich Machine Translation"

Diversifying the Mixture-of-Experts Representation for Language Models with Orthogonal Optimizer
Boan Liu, Liang Ding, Li Shen, Keqin Peng, Yu Cao, Dazhao Cheng, Dacheng Tao
15 October 2023

A Contrastive Cross-Channel Data Augmentation Framework for Aspect-based Sentiment Analysis
Bing Wang, Liang Ding, Qihuang Zhong, Ximing Li, Dacheng Tao
16 April 2022