ResearchTrend.AI

arXiv: 2304.12149
Exploring shared memory architectures for end-to-end gigapixel deep learning

24 April 2023
Lucas W. Remedios
L. Cai
Samuel W. Remedios
Karthik Ramadass
Aravind Krishnan
Ruining Deng
C. Cui
Shunxing Bao
Lori A. Coburn
Yuankai Huo
Bennett A. Landman
Topics: MedIm, VLM

Papers citing "Exploring shared memory architectures for end-to-end gigapixel deep learning"

1 of 1 citing papers shown
Distributed Training of Deep Neural Networks: Theoretical and Practical Limits of Parallel Scalability
J. Keuper
Franz-Josef Pfreundt
Topics: GNN
22 Sep 2016