Improving Slow Transfer Predictions: Generative Methods Compared

International Conference on Computing, Networking and Communications (ICNC), 2025
Jacob Taegon Kim
Alex Sim
Kesheng Wu
Jinoh Kim
Main: 4 pages, 3 figures, 3 tables; bibliography: 1 page
Abstract

Monitoring data transfer performance is a crucial task in scientific computing networks. By predicting performance early in the communication phase, potentially sluggish transfers can be identified and selectively monitored, optimizing network usage and overall performance. A key bottleneck to improving the predictive power of machine learning (ML) models in this context is class imbalance. This work addresses the class imbalance problem to enhance the accuracy of performance predictions. We analyze and compare various augmentation strategies, including traditional oversampling methods and generative techniques. Additionally, we vary the class imbalance ratios in the training datasets to evaluate their impact on model performance. While augmentation can improve performance, the gains diminish as the imbalance ratio increases. We conclude that even advanced generative techniques such as CTGAN do not significantly outperform simple stratified sampling.
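As context for the comparison above, random oversampling — the simplest of the traditional augmentation strategies the abstract mentions — can be sketched in pure Python. The class labels and values below are illustrative only, not data from the paper:

```python
import random

def random_oversample(samples, labels, seed=0):
    """Duplicate minority-class samples (with replacement) until all
    classes have as many samples as the largest class."""
    rng = random.Random(seed)
    # Group sample indices by class label.
    by_class = {}
    for i, y in enumerate(labels):
        by_class.setdefault(y, []).append(i)
    target = max(len(idx) for idx in by_class.values())
    out_x, out_y = list(samples), list(labels)
    for y, idx in by_class.items():
        # Draw duplicates from this class until it reaches the target size.
        for _ in range(target - len(idx)):
            j = rng.choice(idx)
            out_x.append(samples[j])
            out_y.append(y)
    return out_x, out_y

# Illustrative imbalanced set: 4 "fast" transfers, 1 "slow" transfer.
X = [[10.0], [12.0], [9.5], [11.0], [80.0]]
y = ["fast", "fast", "fast", "fast", "slow"]
Xb, yb = random_oversample(X, y)
# The rebalanced set now has equal class counts.
```

Generative approaches such as CTGAN replace the duplication step with samples drawn from a model fitted to the minority class, at considerably higher training cost — which is what makes the paper's finding (no significant gain over stratified sampling) notable.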
