Fast and Low-Memory Deep Neural Networks Using Binary Matrix Factorization

24 October 2022
Alireza Bordbar
M. Kahaei
arXiv:2210.13468 · PDF · HTML
Abstract

Despite the outstanding performance of deep neural networks in a wide range of applications, they remain computationally intensive and require large amounts of memory. This motivates further research on reducing the resources needed to implement such networks. One efficient approach to this problem is matrix factorization, which has proven effective across different networks. In this paper, we employ binary matrix factorization and show that it substantially reduces the resources required by deep neural networks. In effect, this technique can enable the practical implementation of such networks.
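
To make the idea concrete, here is a minimal sketch of binary matrix factorization applied to a dense weight matrix: W (m x n) is approximated by B @ C, where B (m x k) has entries restricted to {-1, +1} and C (k x n) is real-valued. The function name, the choice of ±1 entries, and the alternating update heuristic are illustrative assumptions, not the authors' exact algorithm.

```python
# Illustrative sketch of binary matrix factorization for layer compression.
# NOTE: a heuristic example under assumed choices, not the paper's algorithm.
import numpy as np


def binary_matrix_factorization(W, k, n_iters=50, seed=0):
    """Alternating updates: exact least squares for C, sign projection for B."""
    rng = np.random.default_rng(seed)
    m, n = W.shape
    # Random initialization of the binary factor with entries in {-1, +1}.
    B = np.sign(rng.standard_normal((m, k)))
    B[B == 0] = 1.0
    for _ in range(n_iters):
        # With B fixed, min_C ||W - B C||_F^2 is an ordinary least-squares problem.
        C, *_ = np.linalg.lstsq(B, W, rcond=None)
        # With C fixed, solve the unconstrained problem for B, then project each
        # entry onto {-1, +1} (a common heuristic for the binary constraint).
        B_real, *_ = np.linalg.lstsq(C.T, W.T, rcond=None)
        B = np.sign(B_real.T)
        B[B == 0] = 1.0
    return B, C


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    W = rng.standard_normal((256, 128))        # dense layer weights
    B, C = binary_matrix_factorization(W, k=32)
    rel_err = np.linalg.norm(W - B @ C) / np.linalg.norm(W)
    print(f"relative reconstruction error: {rel_err:.3f}")
```

Replacing W with the pair (B, C) stores B as sign bits and C as a small real matrix, which is where the memory and compute savings described in the abstract come from when k is much smaller than m and n.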
