
arXiv:2206.04168 (v2, latest)

Incremental Recursive Ranking Grouping for Large Scale Global Optimization

8 June 2022
M. Komarnicki
M. Przewozniczek
H. Kwasnicka
Abstract

Real-world optimization problems may differ in their underlying structure. In black-box optimization, the dependencies between decision variables are unknown, although some techniques can discover such interactions accurately. In Large Scale Global Optimization (LSGO), problems are high-dimensional, and decomposing them into subproblems that are optimized separately has proven effective. The effectiveness of such approaches, however, depends heavily on the accuracy of the problem decomposition. Many state-of-the-art decomposition strategies are derived from Differential Grouping (DG), but if a given problem consists of non-additively separable subproblems, their ability to detect only true interactions may decrease significantly. We therefore propose Incremental Recursive Ranking Grouping (IRRG), which does not suffer from this flaw. IRRG consumes more fitness function evaluations than recent DG-based proposals, e.g., Recursive DG 3 (RDG3). Nevertheless, the Cooperative Co-evolution frameworks considered here were similarly effective after embedding either IRRG or RDG3 on problems with additively separable subproblems, which suit RDG3. After replacing additive separability with non-additive separability, however, embedding IRRG led to results of significantly higher quality.
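To make the DG family's weakness concrete, the core Differential Grouping interaction test can be sketched as follows. This is an illustrative minimal version, not the paper's method: the function name `dg_interacts`, the perturbation size, and the fixed threshold are assumptions for the sketch, and practical DG variants (DG2, RDG3) use adaptive thresholds. Two variables are flagged as interacting when perturbing one changes the fitness by a different amount depending on the value of the other, which is exactly the criterion that additively separable subproblems satisfy but non-additively separable ones can break.

```python
import numpy as np

def dg_interacts(f, x, i, j, delta=10.0, eps=1e-6):
    # Illustrative DG-style check: compare the fitness change caused by
    # perturbing x_i alone against the same perturbation applied after
    # shifting x_j. Equal changes => additively separable pair.
    e_i = np.zeros_like(x); e_i[i] = delta
    e_j = np.zeros_like(x); e_j[j] = delta
    d1 = f(x + e_i) - f(x)              # effect of perturbing x_i alone
    d2 = f(x + e_i + e_j) - f(x + e_j)  # same perturbation, x_j shifted
    return abs(d1 - d2) > eps

# Additively separable test function: no coupling term.
sep = lambda v: v[0]**2 + v[1]**2
# Non-separable: the product term couples x0 and x1.
nonsep = lambda v: v[0]**2 + v[1]**2 + v[0]*v[1]

x = np.zeros(2)
print(dg_interacts(sep, x, 0, 1))     # False
print(dg_interacts(nonsep, x, 0, 1))  # True
```

Note that for a non-additively separable composition such as f(x) = g(x0) * h(x1), this difference-of-differences is generally nonzero even though the subproblems could still be optimized independently, which is the failure mode IRRG's ranking-based grouping is designed to avoid.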

View on arXiv