arXiv:2311.06395
A statistical perspective on algorithm unrolling models for inverse problems

10 November 2023
Yves Atchadé
Xinru Liu
Qiuyun Zhu
Abstract

We consider inverse problems where the conditional distribution of the observation $\mathbf{y}$ given the latent variable of interest $\mathbf{x}$ (also known as the forward model) is known, and we have access to a data set in which multiple instances of $\mathbf{x}$ and $\mathbf{y}$ are both observed. In this context, algorithm unrolling has become a very popular approach for designing state-of-the-art deep neural network architectures that effectively exploit the forward model. We analyze the statistical complexity of the gradient descent network (GDN), an algorithm unrolling architecture driven by proximal gradient descent. We show that the unrolling depth needed for the optimal statistical performance of GDNs is of order $\log(n)/\log(\varrho_n^{-1})$, where $n$ is the sample size and $\varrho_n$ is the convergence rate of the corresponding gradient descent algorithm. We also show that when the negative log-density of the latent variable $\mathbf{x}$ has a simple proximal operator, a GDN unrolled at depth $D'$ can solve the inverse problem at the parametric rate $O(D'/\sqrt{n})$. Our results thus also suggest that algorithm unrolling models are prone to overfitting as the unrolling depth $D'$ increases. We provide several examples to illustrate these results.
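To make the unrolling idea concrete, here is a minimal sketch of the kind of architecture the abstract describes: proximal gradient descent for a linear forward model $\mathbf{y} \approx A\mathbf{x}$, unrolled for a fixed depth $D'$. The choice of a linear forward model, the soft-thresholding prox (i.e. an $\ell_1$ negative log-density, one example of a "simple proximal operator"), and the function names are illustrative assumptions, not taken from the paper; in a trained GDN the step size and threshold would be learnable per layer rather than fixed.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1 -- a "simple" prox in the
    # sense of the abstract (closed form, elementwise).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def unrolled_gdn(y, A, depth, step, tau):
    """Proximal gradient descent for y ~ A x, unrolled to `depth` layers.

    Each loop iteration corresponds to one network layer; `depth`
    plays the role of the unrolling depth D' in the abstract.
    """
    x = np.zeros(A.shape[1])
    for _ in range(depth):
        grad = A.T @ (A @ x - y)                       # gradient of 0.5*||y - A x||^2
        x = soft_threshold(x - step * grad, step * tau)  # prox step on the prior term
    return x
```

With a well-conditioned $A$ and a sparse ground truth, a modest depth already recovers $\mathbf{x}$ accurately, consistent with the abstract's point that the depth needed for good statistical performance grows only logarithmically with the sample size.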
