Boosting Generalization in Diffusion-Based Neural Combinatorial Solver via Energy-guided Sampling

Abstract

Diffusion-based Neural Combinatorial Optimization (NCO) has demonstrated effectiveness in solving NP-complete (NPC) problems by learning discrete diffusion models for solution generation, eliminating hand-crafted domain knowledge. Despite their success, existing NCO methods face significant challenges in both cross-scale and cross-problem generalization, as well as high training costs compared to traditional solvers. While recent studies have introduced training-free guidance approaches that leverage pre-defined guidance functions for zero-shot conditional generation, such methodologies have not been extensively explored in combinatorial optimization. To bridge this gap, we propose a general energy-guided sampling framework applied at inference time that enhances both the cross-scale and cross-problem generalization capabilities of diffusion-based NCO solvers without requiring additional training. We provide a theoretical analysis that helps explain the cross-problem transfer capability. Our experimental results demonstrate that a diffusion solver, trained exclusively on the Traveling Salesman Problem (TSP), can achieve competitive zero-shot solution generation on TSP variants, such as the Prize Collecting TSP (PCTSP) and the Orienteering Problem (OP), through energy-guided sampling across different problem scales.
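To make the core idea concrete, the following is a minimal, hypothetical sketch of energy-guided sampling: during each reverse (denoising) step, the sampler's edge logits are nudged downhill on a problem-specific energy (here, a soft TSP tour length), without retraining the model. The energy function, step size `eta`, and the random-logit stand-in for the diffusion model are all illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy TSP instance: pairwise distances between 5 random cities (hypothetical data).
n = 5
coords = rng.random((n, 2))
dist = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def energy(x):
    # Energy of a soft edge-selection matrix x: the expected tour length
    # sum_ij dist_ij * x_ij. Lower energy corresponds to a shorter tour.
    return float((dist * x).sum())

def energy_grad(x):
    # This toy energy is linear in x, so its gradient is just the distance matrix.
    return dist

def guided_step(logits, eta=0.5):
    # One energy-guided update: move the edge logits downhill on the energy
    # surface evaluated at the current soft sample (training-free guidance).
    return logits - eta * energy_grad(sigmoid(logits))

# Stand-in for a denoising trajectory: start from unconditional logits
# (here plain noise, in place of a trained diffusion model's output)
# and apply several guided refinement steps.
logits0 = rng.normal(size=(n, n))
logits = logits0.copy()
for _ in range(20):
    logits = guided_step(logits)

# The guided sample has lower energy (shorter expected tour) than the unguided one.
print(energy(sigmoid(logits)), "<", energy(sigmoid(logits0)))
```

Because the guidance only reads gradients of a user-supplied energy, swapping in a PCTSP or OP objective changes the sampler's behavior without touching the trained TSP model, which is the mechanism behind the zero-shot cross-problem transfer described above.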

@article{lei2025_2502.12188,
  title={Boosting Generalization in Diffusion-Based Neural Combinatorial Solver via Energy-guided Sampling},
  author={Haoyu Lei and Kaiwen Zhou and Yinchuan Li and Zhitang Chen and Farzan Farnia},
  journal={arXiv preprint arXiv:2502.12188},
  year={2025}
}