Training data inference quantifies the impact of training samples on a model's predictions. Research in this area focuses on efficient methods for estimating the influence of individual training samples or of groups of samples.
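A minimal sketch of the basic idea, not taken from any of the papers below: the influence of a training sample can be measured exactly by leave-one-out retraining, i.e., comparing the test loss of a model trained on the full dataset with the loss after removing that sample. The dataset, model, and loss below are illustrative assumptions; in practice the community studies cheaper approximations because retraining once per sample does not scale.

```python
# Leave-one-out influence estimation (illustrative sketch).
# Influence of sample i = test loss without sample i minus test loss with it:
# a positive value means the sample was helpful for the test set.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Small synthetic dataset, split into train and held-out test points.
X, y = make_classification(n_samples=60, n_features=5, random_state=0)
X_train, y_train = X[:40], y[:40]
X_test, y_test = X[40:], y[40:]

def test_loss(model):
    # Mean negative log-likelihood on the held-out points.
    probs = model.predict_proba(X_test)
    return -np.mean(np.log(probs[np.arange(len(y_test)), y_test] + 1e-12))

full_model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
base_loss = test_loss(full_model)

influences = []
for i in range(len(X_train)):
    mask = np.arange(len(X_train)) != i
    loo_model = LogisticRegression(max_iter=1000).fit(X_train[mask], y_train[mask])
    influences.append(test_loss(loo_model) - base_loss)

# Rank training samples from most helpful to most harmful.
ranking = np.argsort(influences)[::-1]
print("Most influential training samples:", ranking[:5])
```

Gradient-based estimators (influence functions, TracIn-style gradient similarity, Shapley-value approximations) aim to approximate this quantity, or group-level analogues of it, without repeated retraining.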
| Title | Authors |
|---|---|
| Balanced contributions, consistency, and value for games with externalities | André Casajus, Yukihiko Funaki, Frank Huettner |
| Not All Instances Are Equally Valuable: Towards Influence-Weighted Dataset Distillation | Qiyan Deng, Changqian Zheng, Lianpeng Qiao, Yuping Wang, Chengliang Chai, Lei Cao |
| Un-Attributability: Computing Novelty From Retrieval & Semantic Similarity | Philipp Davydov, Ameya Prabhu, Matthias Bethge, Elisa Nguyen, Seong Joon Oh |
| Newfluence: Boosting Model Interpretability and Understanding in High Dimensions | Haolin Zou, Arnab Auddy, Yongchan Kwon, Kamiar Rahnama Rad, Arian Maleki |
| Attributing Data for Sharpness-Aware Minimization | Chenyang Ren, Yifan Jia, Huanyi Xie, Zhaobin Xu, Tianxing Wei, Liangyu Wang, Lijie Hu, Di Wang |
| Counterfactual Explanation of Shapley Value in Data Coalitions | Michelle Si, Jian Pei |
| Counterfactual Influence as a Distributional Quantity | Matthieu Meeus, Igor Shilov, Georgios Kaissis, Yves-Alexandre de Montjoye |