FSL-SAGE: Accelerating Federated Split Learning via Smashed Activation Gradient Estimation

arXiv: 2505.23182, v2 (latest)
29 May 2025
Authors: Srijith Nair, Michael Lin, Amirreza Talebi, Peizhong Ju, Elizabeth S. Bentley, Jia Liu
Topic: FedML
Links: arXiv (abs) · PDF · HTML

Papers citing "FSL-SAGE: Accelerating Federated Split Learning via Smashed Activation Gradient Estimation"

No citing papers found (0 of 0 shown).