ResearchTrend.AI

Learning from compressed observations
Maxim Raginsky
5 April 2007 · arXiv: 0704.0671 (abs · PDF · HTML)

Papers citing "Learning from compressed observations"

6 papers
Project Alexandria: Towards Freeing Scientific Knowledge from Copyright Burdens via LLMs
Christoph Schuhmann, Gollam Rabby, Christian Schroeder de Witt, Tawsif Ahmed, Andreas Hochlehnert, ..., Ludwig Schmidt, R. Kaczmarczyk, Sören Auer, J. Jitsev, Matthias Bethge
26 Feb 2025

Learning of Tree-Structured Gaussian Graphical Models on Distributed Data under Communication Constraints
Mostafa Tavassolipour, S. Motahari, M. Manzuri-Shalmani
21 Sep 2018

Rate-Distortion Bounds on Bayes Risk in Supervised Learning
M. Nokleby, Ahmad Beirami, Robert Calderbank
08 May 2016

Are Slepian-Wolf Rates Necessary for Distributed Parameter Estimation?
Allerton Conference on Communication, Control, and Computing (Allerton), 2015
M. Gamal, Lifeng Lai
11 Aug 2015

Quantized Nonparametric Estimation over Sobolev Ellipsoids
Yuancheng Zhu, John D. Lafferty
25 Mar 2015

Achievability results for statistical learning under communication constraints
Maxim Raginsky
13 Jan 2009