On Dissipativity of Cross-Entropy Loss in Training ResNets

29 May 2024
Jens Püttschneider, T. Faulwasser
arXiv: 2405.19013 (PDF / HTML available)

Papers citing "On Dissipativity of Cross-Entropy Loss in Training ResNets"

2 / 2 papers shown
1. Deep Equilibrium Models are Almost Equivalent to Not-so-deep Explicit Models for High-dimensional Gaussian Mixtures
   Zenan Ling, Longbo Li, Zhanbo Feng, Yixuan Zhang, Feng Zhou, Robert C. Qiu, Zhenyu Liao
   05 Feb 2024
2. On the Turnpike to Design of Deep Neural Nets: Explicit Depth Bounds
   T. Faulwasser, Arne-Jens Hempel, S. Streif
   08 Jan 2021