Sustainable Supercomputing for AI: GPU Power Capping at HPC Scale

25 February 2024
Dan Zhao, S. Samsi, Joseph McDonald, Baolin Li, David Bestor, Michael Jones, Devesh Tiwari, V. Gadepally

ArXiv · PDF · HTML
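
The paper's subject is device-level GPU power capping applied across an HPC cluster. For reference, below is a minimal sketch of that mechanism using NVIDIA's NVML Python bindings (pynvml); the device index and the 250 W target are illustrative assumptions, not values taken from the paper, and applying a cap requires administrative privileges (equivalent to `nvidia-smi -pl <watts>`).

```python
# Minimal sketch: query and set a per-GPU power cap via NVML (pynvml).
# The 250 W target is an arbitrary example, not a recommendation.
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU on the node

    # NVML reports power limits in milliwatts.
    min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
    default_mw = pynvml.nvmlDeviceGetPowerManagementDefaultLimit(handle)
    print(f"allowed cap range: {min_mw / 1000:.0f}-{max_mw / 1000:.0f} W "
          f"(default {default_mw / 1000:.0f} W)")

    # Clamp the requested cap into the supported range, then apply it.
    # Setting a limit needs root privileges, like `nvidia-smi -pl 250`.
    target_mw = max(min_mw, min(250_000, max_mw))
    pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)
finally:
    pynvml.nvmlShutdown()
```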

Papers citing "Sustainable Supercomputing for AI: GPU Power Capping at HPC Scale"

4 papers shown
Sustainable Computing -- Without the Hot Air
Noman Bashir, David E. Irwin, Prashant J. Shenoy, Abel Souza
30 Jun 2022

Great Power, Great Responsibility: Recommendations for Reducing Energy for Training Language Models
Joseph McDonald, Baolin Li, Nathan C. Frey, Devesh Tiwari, V. Gadepally, S. Samsi
19 May 2022

Sparsity in Deep Learning: Pruning and growth for efficient inference and training in neural networks
Torsten Hoefler, Dan Alistarh, Tal Ben-Nun, Nikoli Dryden, Alexandra Peste
31 Jan 2021

Megatron-LM: Training Multi-Billion Parameter Language Models Using Model Parallelism
M. Shoeybi, M. Patwary, Raul Puri, P. LeGresley, Jared Casper, Bryan Catanzaro
17 Sep 2019