Tensor networks and efficient descriptions of classical data

11 March 2021
Sirui Lu
Márton Kanász-Nagy
I. Kukuljan
J. I. Cirac
Abstract

We investigate the potential of tensor-network-based machine learning methods to scale to large image and text data sets. To that end, we study how the mutual information between a subregion and its complement scales with the subsystem size $L$, in analogy with how this is done in quantum many-body physics. We find that for text, the mutual information scales as a power law $L^\nu$ with an exponent close to a volume law, indicating that text cannot be efficiently described by 1D tensor networks. For images, the scaling is close to an area law, hinting that 2D tensor networks such as PEPS could have adequate expressibility. For the numerical analysis, we introduce a mutual information estimator based on autoregressive networks, and we also use convolutional neural networks in a neural estimator method.
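The abstract mentions, but does not detail, a mutual information estimator built from autoregressive networks. Below is a minimal sketch of one common way such an estimator can be assembled, assuming each entropy in $I(A:B) = H(A) + H(B) - H(AB)$ is replaced by the cross-entropy of an autoregressive model trained on the corresponding marginal; the function names and interfaces are hypothetical and not taken from the paper.

```python
import numpy as np

def cross_entropy(logq, samples):
    """Monte Carlo estimate of the cross-entropy -E[log q(x)] over held-out
    samples; this upper-bounds the true entropy H(X) when q is a trained
    autoregressive model of the same variable."""
    return -np.mean([logq(x) for x in samples])

def mutual_information_estimate(logq_A, logq_B, logq_AB,
                                samples_A, samples_B, samples_AB):
    """Plug-in estimate of the mutual information between a subregion A and
    its complement B via I(A:B) = H(A) + H(B) - H(AB), with each entropy
    approximated by the cross-entropy of a separately trained model."""
    return (cross_entropy(logq_A, samples_A)
            + cross_entropy(logq_B, samples_B)
            - cross_entropy(logq_AB, samples_AB))
```

Repeating such an estimate while varying the subregion size $L$, and then fitting the results against a power law $L^\nu$ versus a constant, is the kind of scaling analysis that distinguishes the text and image cases described above.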
