ResearchTrend.AI
Exploring Fine-tuning Techniques for Pre-trained Cross-lingual Models via Continual Learning
arXiv:2004.14218 · 29 April 2020
Zihan Liu, Genta Indra Winata, Andrea Madotto, Pascale Fung
    CLL

Papers citing "Exploring Fine-tuning Techniques for Pre-trained Cross-lingual Models via Continual Learning"

3 / 3 papers shown
  1. HOP to the Next Tasks and Domains for Continual Learning in NLP
     Umberto Michieli, Mete Ozay · VLM · 28 Feb 2024
  2. Memory Efficient Continual Learning with Transformers
     B. Ermiş, Giovanni Zappella, Martin Wistuba, Aditya Rawal, Cédric Archambeau · CLL · 09 Mar 2022
  3. Word Translation Without Parallel Data
     Alexis Conneau, Guillaume Lample, Marc'Aurelio Ranzato, Ludovic Denoyer, Hervé Jégou · 11 Oct 2017