Cited By — arXiv:2201.11218 (v2, latest)

DNNFuser: Generative Pre-Trained Transformer as a Generalized Mapper for Layer Fusion in DNN Accelerators

26 January 2022
Sheng-Chun Kao, Xiaoyu Huang, T. Krishna
AI4CE

Papers citing "DNNFuser: Generative Pre-Trained Transformer as a Generalized Mapper for Layer Fusion in DNN Accelerators"

3 / 3 papers shown
An approach to optimize inference of the DIART speaker diarization pipeline
Roman Aperdannier, Sigurd Schacht, Alexander Piazza
05 Aug 2024
Improvements in Interlayer Pipelining of CNN Accelerators Using Genetic Algorithms
Mark Horeni, Siddharth Joshi
20 Nov 2023
DeFiNES: Enabling Fast Exploration of the Depth-first Scheduling Space for DNN Accelerators through Analytical Modeling
International Symposium on High-Performance Computer Architecture (HPCA), 2022
L. Mei, Koen Goetschalckx, Arne Symons, Marian Verhelst
10 Dec 2022