
Leveraging Memory Effects and Gradient Information in Consensus-Based Optimization: On Global Convergence in Mean-Field Law

22 November 2022
Konstantin Riedl

Papers citing "Leveraging Memory Effects and Gradient Information in Consensus-Based Optimization: On Global Convergence in Mean-Field Law"

3 papers shown

  • Gradient is All You Need?
    Konstantin Riedl, T. Klock, Carina Geldhauser, M. Fornasier
    16 Jun 2023

  • FedCBO: Reaching Group Consensus in Clustered Federated Learning through Consensus-based Optimization
    J. Carrillo, Nicolas García Trillos, Sixu Li, Yuhua Zhu
    04 May 2023

  • Ensemble-based gradient inference for particle methods in optimization and sampling
    C. Schillings, C. Totzeck, Philipp Wacker
    23 Sep 2022