ResearchTrend.AI
Go beyond End-to-End Training: Boosting Greedy Local Learning with Context Supply

12 December 2023
Chengting Yu
Fengzhao Zhang
Hanzhi Ma
Aili Wang
Er-ping Li

Papers citing "Go beyond End-to-End Training: Boosting Greedy Local Learning with Context Supply"

3 papers shown

NeuLite: Memory-Efficient Federated Learning via Elastic Progressive Training
Yebo Wu, Li Li, Chunlin Tian, Dubing Chen, Chengzhong Xu
FedML · 20 Aug 2024

Training Deep Architectures Without End-to-End Backpropagation: A Survey on the Provably Optimal Methods
Shiyu Duan, José C. Príncipe
MQ · 09 Jan 2021

Densely Connected Convolutional Networks
Gao Huang, Zhuang Liu, Laurens van der Maaten, Kilian Q. Weinberger
PINN · 3DV · 25 Aug 2016