arXiv:2002.03229

Supervised Quantile Normalization for Low-rank Matrix Approximation

8 February 2020
Marco Cuturi
O. Teboul
Jonathan Niles-Weed
Jean-Philippe Vert
Abstract

Low-rank matrix factorization is a fundamental building block in machine learning, used for instance to summarize gene expression profile data or word-document counts. To be robust to outliers and differences in scale across features, a matrix factorization step is usually preceded by ad hoc feature normalization steps, such as tf-idf scaling or data whitening. We propose in this work to learn these normalization operators jointly with the factorization itself. More precisely, given a d×n matrix X of d features measured on n individuals, we propose to learn the parameters of quantile normalization operators that can operate row-wise on the values of X and/or of its factorization UV to improve the quality of the low-rank representation of X itself. This optimization is facilitated by the introduction of a new differentiable quantile normalization operator built using optimal transport, providing new results on top of existing work by (Cuturi et al. 2019). We demonstrate the applicability of these techniques on synthetic and genomics datasets.
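For intuition, classical (hard) quantile normalization replaces each row's values with target quantiles assigned by rank; the paper's contribution is a differentiable, learnable version of this operator built from optimal transport, in the spirit of the soft sorting of Cuturi et al. (2019). Below is a minimal NumPy sketch of only the classical, non-differentiable baseline, not the paper's method; the function and variable names are illustrative, not from the paper:

```python
import numpy as np

def quantile_normalize_row(x, target_quantiles):
    """Hard quantile normalization of one feature row:
    the k-th smallest entry of x is replaced by the k-th
    smallest target quantile (rank assignment via sorting,
    hence non-differentiable, unlike the paper's operator)."""
    ranks = np.argsort(np.argsort(x))        # rank of each entry of x
    return np.sort(target_quantiles)[ranks]  # assign target value by rank

# Example: normalize each row of a d×n matrix onto a uniform grid,
# mimicking the row-wise application to X described in the abstract.
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 5))                        # d=3 features, n=5 individuals
targets = np.linspace(0.0, 1.0, X.shape[1])        # illustrative target quantiles
X_norm = np.stack([quantile_normalize_row(row, targets) for row in X])
```

After this step every row of X_norm is a permutation of the same target quantiles, which removes scale differences across features; the paper's learned operator instead parameterizes the targets and smooths the sort so gradients can flow back into the factorization UV.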
