
A Practical & Unified Notation for Information-Theoretic Quantities in ML

22 June 2021
Andreas Kirsch, Y. Gal

Papers citing "A Practical & Unified Notation for Information-Theoretic Quantities in ML"

5 / 5 papers shown
Unifying Approaches in Active Learning and Active Sampling via Fisher Information and Information-Theoretic Quantities
Andreas Kirsch, Y. Gal · FedML · 22 / 21 / 0 · 01 Aug 2022

Marginal and Joint Cross-Entropies & Predictives for Online Bayesian Inference, Active Learning, and Active Sampling
Andreas Kirsch, Jannik Kossen, Y. Gal · UQCV, BDL · 44 / 3 / 0 · 18 May 2022

A Note on "Assessing Generalization of SGD via Disagreement"
Andreas Kirsch, Y. Gal · FedML, UQCV · 21 / 15 / 0 · 03 Feb 2022

Test Distribution-Aware Active Learning: A Principled Approach Against Distribution Shift and Outliers
Andreas Kirsch, Tom Rainforth, Y. Gal · OOD, TTA · 24 / 22 / 0 · 22 Jun 2021

Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning
Y. Gal, Zoubin Ghahramani · UQCV, BDL · 285 / 9,136 / 0 · 06 Jun 2015