Approximation Trees: Statistical Stability in Model Distillation
Yichen Zhou, Zhengze Zhou, Giles Hooker
22 August 2018 · arXiv:1808.07573

Papers citing "Approximation Trees: Statistical Stability in Model Distillation"

6 papers shown

S-LIME: Stabilized-LIME for Model Explanation
Zhengze Zhou, Giles Hooker, Fei Wang · 15 Jun 2021 · FAtt

How to Evaluate Uncertainty Estimates in Machine Learning for Regression?
Laurens Sluijterman, Eric Cator, Tom Heskes · 07 Jun 2021 · UQCV

HYDRA: Hypergradient Data Relevance Analysis for Interpreting Deep Neural Networks
Yuanyuan Chen, Boyang Albert Li, Han Yu, Pengcheng Wu, Chunyan Miao · 04 Feb 2021 · TDI

Distilling Black-Box Travel Mode Choice Model for Behavioral Interpretation
Xilei Zhao, Zhengze Zhou, X. Yan, Pascal Van Hentenryck · 30 Oct 2019

Unbiased Measurement of Feature Importance in Tree-Based Methods
Zhengze Zhou, Giles Hooker · 12 Mar 2019

Boosting Random Forests to Reduce Bias; One-Step Boosted Forest and its Variance Estimate
Indrayudh Ghosal, Giles Hooker · 21 Mar 2018