Large Language Models as Particle Swarm Optimizers

12 April 2025
Yamato Shinohara
Jinglue Xu
Tianshui Li
Hitoshi Iba
Abstract

Optimization problems often require domain-specific expertise to design problem-dependent methodologies. Recently, several approaches have gained attention by integrating large language models (LLMs) into genetic algorithms. Building on this trend, we introduce Language Model Particle Swarm Optimization (LMPSO), a novel method that incorporates an LLM into the swarm intelligence framework of Particle Swarm Optimization (PSO). In LMPSO, the velocity of each particle is represented as a prompt that generates the next candidate solution, leveraging the capabilities of an LLM to produce solutions in accordance with the PSO paradigm. This integration enables an LLM-driven search process that adheres to the foundational principles of PSO. The proposed LMPSO approach is evaluated across multiple problem domains, including the Traveling Salesman Problem (TSP), heuristic improvement for TSP, and symbolic regression. These problems are traditionally challenging for standard PSO due to the structured nature of their solutions. Experimental results demonstrate that LMPSO is particularly effective for solving problems where solutions are represented as structured sequences, such as mathematical expressions or programmatic constructs. By incorporating LLMs into the PSO framework, LMPSO establishes a new direction in swarm intelligence research. This method not only broadens the applicability of PSO to previously intractable problems but also showcases the potential of LLMs in addressing complex optimization challenges.
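To make the core idea concrete, here is a minimal sketch of an LMPSO-style loop as described in the abstract: each particle's "velocity" is a prompt referencing the current solution, its personal best, and the swarm's global best, and an LLM generates the next candidate. This is not the authors' implementation; the `llm_generate` function and the prompt wording are hypothetical stand-ins.

def llm_generate(prompt: str) -> str:
    """Placeholder for an LLM completion call returning a candidate solution as text."""
    raise NotImplementedError("Plug in your LLM client here.")

def lmpso(evaluate, initial_solutions, iterations=20):
    """Sketch of an LMPSO-style search loop (illustrative, not the paper's code).

    evaluate: maps a candidate solution (string) to a fitness score (lower is better).
    initial_solutions: list of candidate solutions (strings) seeding the swarm.
    """
    particles = [{"solution": s, "best": s, "best_score": evaluate(s)}
                 for s in initial_solutions]
    best_particle = min(particles, key=lambda p: p["best_score"])
    g_solution, g_score = best_particle["best"], best_particle["best_score"]

    for _ in range(iterations):
        for p in particles:
            # The "velocity" is expressed as a prompt that steers the LLM toward a
            # new candidate, mirroring PSO's pull toward personal and global bests.
            prompt = (
                "You are searching for a better solution to an optimization problem.\n"
                f"Current solution: {p['solution']}\n"
                f"Personal best so far: {p['best']}\n"
                f"Swarm best so far: {g_solution}\n"
                "Propose a new candidate that moves toward the better solutions "
                "while keeping some structure of the current one. Return only the candidate."
            )
            candidate = llm_generate(prompt)
            score = evaluate(candidate)
            p["solution"] = candidate
            if score < p["best_score"]:
                p["best"], p["best_score"] = candidate, score
                if score < g_score:
                    g_solution, g_score = candidate, score
    return g_solution, g_score

For a structured-solution task such as TSP, `evaluate` could compute tour length and the candidates would be permutations written as text; the same loop applies to symbolic regression with expressions as strings.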

@article{shinohara2025_2504.09247,
  title={Large Language Models as Particle Swarm Optimizers},
  author={Yamato Shinohara and Jinglue Xu and Tianshui Li and Hitoshi Iba},
  journal={arXiv preprint arXiv:2504.09247},
  year={2025}
}