On the Maximum Mutual Information Capacity of Neural Architectures

10 June 2020
Brandon Foggo, Nan Yu
arXiv: 2006.06037

Papers citing "On the Maximum Mutual Information Capacity of Neural Architectures"

1 / 1 papers shown
Towards quantifying information flows: relative entropy in deep neural networks and the renormalization group
J. Erdmenger, Kevin T. Grosvenor, R. Jefferson
14 Jul 2021