ResearchTrend.AI

On Optimizing Distributed Tucker Decomposition for Dense Tensors
arXiv:1707.05594

18 July 2017
Venkatesan T. Chakaravarthy
Jee W. Choi
Douglas J. Joseph
Xing Liu
Prakash Murali
Yogish Sabharwal
D. Sreedhar
Papers citing "On Optimizing Distributed Tucker Decomposition for Dense Tensors"

3 papers shown
Deinsum: Practically I/O Optimal Multilinear Algebra
A. Ziogas
Grzegorz Kwaśniewski
Tal Ben-Nun
Timo Schneider
Torsten Hoefler
16 Jun 2022
a-Tucker: Input-Adaptive and Matricization-Free Tucker Decomposition for Dense Tensors on CPUs and GPUs
Min Li
Chuanfu Xiao
Chao Yang
20 Oct 2020
On Optimizing Distributed Tucker Decomposition for Sparse Tensors
Venkatesan T. Chakaravarthy
Jee W. Choi
Douglas J. Joseph
Prakash Murali
Shivmaran S. Pandian
Yogish Sabharwal
D. Sreedhar
25 Apr 2018