FDAPT: Federated Domain-adaptive Pre-training for Language Models

12 July 2023 · arXiv:2307.06933
Lekang Jiang, F. Svoboda, Nicholas D. Lane
FedML · AI4CE

Papers citing "FDAPT: Federated Domain-adaptive Pre-training for Language Models"

2 papers shown

Federated Word2Vec: Leveraging Federated Learning to Encourage Collaborative Representation Learning
Daniel Garcia Bernal, Lodovico Giaretta, Sarunas Girdzijauskas, Magnus Sahlgren
FedML · 19 Apr 2021

Efficient Estimation of Word Representations in Vector Space
Tomáš Mikolov, Kai Chen, G. Corrado, J. Dean
3DV · 16 Jan 2013