Demystifying BERT: Implications for Accelerator Design

14 April 2021
Suchita Pati, Shaizeen Aga, Nuwan Jayasena, Matthew D. Sinclair
arXiv:2104.08335

Papers citing "Demystifying BERT: Implications for Accelerator Design"

3 papers

A Heterogeneous Chiplet Architecture for Accelerating End-to-End Transformer Models
Harsh Sharma, Pratyush Dhingra, J. Doppa, Ümit Y. Ogras, P. Pande
18 Dec 2023

SynCron: Efficient Synchronization Support for Near-Data-Processing Architectures
Christina Giannoula, Nandita Vijaykumar, Nikela Papadopoulou, Vasileios Karakostas, Ivan Fernandez, Juan Gómez Luna, Lois Orosa, N. Koziris, G. Goumas, O. Mutlu
19 Jan 2021

Megatron-LM: Training Multi-Billion Parameter Language Models Using Model Parallelism
M. Shoeybi, M. Patwary, Raul Puri, P. LeGresley, Jared Casper, Bryan Catanzaro
17 Sep 2019