CARLS: Cross-platform Asynchronous Representation Learning System

26 May 2021
Chun-Ta Lu
Yun Zeng
Da-Cheng Juan
Yicheng Fan
Zhe Li
Jan Dlabal
Yi-Ting Chen
Arjun Gopalan
Allan Heydon
Chun-Sung Ferng
Reah Miyara
Ariel Fuxman
Futang Peng
Zhen Li
Tom Duerig
Andrew Tomkins
Abstract

In this work, we propose CARLS, a novel framework that augments the capacity of existing deep learning frameworks by enabling multiple components -- model trainers, knowledge makers, and knowledge banks -- to work together in concert, asynchronously, across hardware platforms. CARLS is particularly suitable for learning paradigms in which model training benefits from additional knowledge inferred or discovered during training, such as node embeddings for graph neural networks or reliable pseudo-labels derived from model predictions. We also describe three learning paradigms -- semi-supervised learning, curriculum learning, and multimodal learning -- as examples that CARLS can scale up efficiently. One version of CARLS has been open-sourced and is available for download at: https://github.com/tensorflow/neural-structured-learning/tree/master/research/carls
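The component split described above can be illustrated with a minimal sketch: a background "knowledge maker" thread periodically refreshes embeddings in a shared "knowledge bank" while the trainer reads whatever embedding is current at each step, never blocking on the maker. All names here (`KnowledgeBank`, `knowledge_maker`, `trainer`) are hypothetical illustrations, not the actual CARLS API.

```python
import threading
import time

class KnowledgeBank:
    """Toy in-memory store of per-node embeddings shared across components.

    This is an illustrative stand-in for the CARLS knowledge bank, not its
    real implementation.
    """
    def __init__(self):
        self._store = {}
        self._lock = threading.Lock()

    def update(self, key, embedding):
        with self._lock:
            self._store[key] = embedding

    def lookup(self, key, default):
        with self._lock:
            return self._store.get(key, default)

def knowledge_maker(bank, stop_event):
    """Background component: keeps refreshing knowledge (stand-in for
    embedding inference or pseudo-label generation)."""
    step = 0
    while not stop_event.is_set():
        step += 1
        bank.update("node_0", [float(step)] * 4)
        time.sleep(0.01)

def trainer(bank, num_steps):
    """Training loop: consumes the most recent knowledge at each step,
    asynchronously -- it never waits for the maker to finish."""
    seen = []
    for _ in range(num_steps):
        embedding = bank.lookup("node_0", default=[0.0] * 4)
        seen.append(embedding)
        time.sleep(0.01)
    return seen

bank = KnowledgeBank()
stop_event = threading.Event()
maker = threading.Thread(target=knowledge_maker, args=(bank, stop_event))
maker.start()
embeddings = trainer(bank, num_steps=5)
stop_event.set()
maker.join()
```

In the real system the three components may run on different hardware platforms and communicate over RPC rather than sharing memory; the sketch only captures the asynchronous producer/consumer relationship.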
