
Reducing Instability in Synthetic Data Evaluation with a Super-Metric in MalDataGen

Main: 2 pages, 4 figures; Bibliography: 3 pages
Abstract

Evaluating the quality of synthetic data remains a persistent challenge in the Android malware domain, owing to instability and a lack of standardization among existing metrics. This work integrates a Super-Metric into MalDataGen that aggregates eight metrics spanning four fidelity dimensions into a single weighted score. Experiments with ten generative models and five balanced datasets demonstrate that the Super-Metric is more stable and consistent than traditional metrics and correlates more strongly with the actual performance of downstream classifiers.
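The aggregation described above, combining several normalized metric scores into one weighted value, can be sketched as a weighted mean. The metric names, weights, and normalization in this sketch are illustrative assumptions, not the paper's actual configuration.

```python
# Hypothetical sketch of a weighted super-metric. The real MalDataGen
# implementation, metric names, and weights are not specified here.
def super_metric(scores, weights):
    """Aggregate per-metric scores (each assumed normalized to [0, 1])
    into a single weighted score.

    `scores` and `weights` are dicts mapping metric name -> value.
    """
    if set(scores) != set(weights):
        raise ValueError("scores and weights must cover the same metrics")
    total_weight = sum(weights.values())
    if total_weight <= 0:
        raise ValueError("weights must sum to a positive value")
    return sum(scores[m] * weights[m] for m in scores) / total_weight

# Usage with made-up metric names and equal weights.
example_scores = {"fidelity": 0.9, "diversity": 0.8, "utility": 0.85, "privacy": 0.7}
example_weights = {name: 1.0 for name in example_scores}
overall = super_metric(example_scores, example_weights)
```

A weighted mean keeps the aggregate on the same [0, 1] scale as its inputs, which is one simple way a single score can remain comparable across models and datasets.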
