arXiv:2210.16311
Simultaneous off-the-grid learning of mixtures issued from a continuous dictionary

27 October 2022
C. Butucea
Jean-François Delmas
A. Dutfoy
Clément Hardy
Abstract

In this paper, we observe a set, possibly a continuum, of signals corrupted by noise. Each signal is a finite mixture of an unknown number of features belonging to a continuous dictionary. The continuous dictionary is parametrized by a real non-linear parameter. We assume that the signals share an underlying structure: each signal has its active features included in a finite and sparse set. We formulate a regularized optimization problem to estimate simultaneously the linear coefficients in the mixtures and the non-linear parameters of the features. The optimization problem is composed of a data-fidelity term and an $(\ell_1, L^p)$-penalty. We call its solution the Group-Nonlinear-Lasso and provide high-probability bounds on the prediction error using certificate functions. Following recent works on the geometry of off-the-grid methods, we show that such functions can be constructed provided the parameters of the active features are pairwise separated by a constant with respect to a Riemannian metric. When the number of signals is finite and the noise is assumed Gaussian, we give refinements of our results for $p=1$ and $p=2$ using tail bounds on suprema of Gaussian and $\chi^2$ random processes. When $p=2$, our prediction error reaches the rates obtained by the Group-Lasso estimator in the multi-task linear regression model. Furthermore, for $p=2$ these prediction rates are faster than for $p=1$ when all signals share most of the non-linear parameters.
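For intuition only, the following is a minimal numerical sketch of the $p=2$ case described above, not code from the paper. Schematically, the estimator minimizes, over linear coefficients $B$ and non-linear parameters $\theta$, a least-squares data-fidelity term plus $\kappa \sum_k \|B_{\cdot,k}\|_{L^2}$, i.e. an $\ell_1$-sum over features of the $L^2$ norm of each coefficient column across signals. The Gaussian-bump dictionary `phi`, the sampling grid, the fixed number of candidate features `K`, the weight `kappa`, and the `eps`-smoothing of the nonsmooth penalty are all illustrative assumptions; the paper's certificate-based analysis and separation conditions are not reproduced here.

```python
# A minimal sketch of the Group-Nonlinear-Lasso objective for p = 2, under
# illustrative assumptions (Gaussian-bump dictionary, fixed number K of
# candidate features, smoothed group penalty so a quasi-Newton solver applies).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
grid = np.linspace(0.0, 1.0, 200)   # sampling points of each signal
T, K = 5, 3                          # number of signals / candidate features
kappa, eps = 0.1, 1e-8               # penalty weight, smoothing constant

def phi(theta):
    """One dictionary feature: a Gaussian bump located at the non-linear
    parameter theta, evaluated on the sampling grid."""
    return np.exp(-((grid - theta) ** 2) / (2 * 0.05 ** 2))

# Synthetic observations: T noisy signals sharing two true feature locations.
true_theta = np.array([0.3, 0.7])
Y = np.stack([
    1.0 * phi(true_theta[0]) - 0.5 * phi(true_theta[1])
    + 0.05 * rng.standard_normal(grid.size)
    for _ in range(T)
])

def objective(z):
    """Data-fidelity term plus the (smoothed) (l1, L^2) group penalty."""
    B = z[: T * K].reshape(T, K)               # linear coefficients, one row per signal
    theta = z[T * K:]                          # non-linear feature parameters
    Phi = np.stack([phi(t) for t in theta])    # K x len(grid) dictionary matrix
    residual = Y - B @ Phi
    fidelity = 0.5 * np.sum(residual ** 2) / T
    # l1 over features of the L^2 norm of each coefficient column,
    # smoothed by eps to keep the objective differentiable.
    penalty = np.sum(np.sqrt(np.sum(B ** 2, axis=0) + eps))
    return fidelity + kappa * penalty

z0 = np.concatenate([0.01 * rng.standard_normal(T * K),
                     rng.uniform(0.0, 1.0, K)])
result = minimize(objective, z0, method="L-BFGS-B")
theta_hat = np.sort(result.x[T * K:])
print("estimated feature locations:", np.round(theta_hat, 3))
```

With K larger than the true number of features, the group penalty should shrink the coefficient columns of spurious features toward zero, which is the joint sparsity-across-signals behaviour the abstract describes; the paper's actual estimator is defined off the grid and analysed via certificate functions rather than by this generic local solver.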
