A survey on multi-player bandits
Etienne Boursier, Vianney Perchet
arXiv:2211.16275, 29 November 2022

Papers citing "A survey on multi-player bandits"

11 / 11 papers shown
Bandit Max-Min Fair Allocation
Tsubasa Harada, Shinji Ito, Hanna Sumita
08 May 2025

QuACK: A Multipurpose Queuing Algorithm for Cooperative $k$-Armed Bandits
Benjamin Howson, Sarah Filippi, Ciara Pike-Burke
31 Oct 2024

Learning to Mitigate Externalities: the Coase Theorem with Hindsight Rationality
Antoine Scheid, Aymeric Capitaine, Etienne Boursier, Eric Moulines, Michael I. Jordan, Alain Durmus
28 Jun 2024

Multi-Player Approaches for Dueling Bandits
Or Raveh, Junya Honda, Masashi Sugiyama
25 May 2024

PPA-Game: Characterizing and Learning Competitive Dynamics Among Online Content Creators
Renzhe Xu, Haotian Wang, Xingxuan Zhang, Bo-wen Li, Peng Cui
22 Mar 2024

Incentivized Learning in Principal-Agent Bandit Games
Antoine Scheid, D. Tiapkin, Etienne Boursier, Aymeric Capitaine, El-Mahdi El-Mhamdi, Eric Moulines, Michael I. Jordan, Alain Durmus
06 Mar 2024

GROS: A General Robust Aggregation Strategy
A. Cholaquidis, Emilien Joly, L. Moreno
23 Feb 2024

Constant or logarithmic regret in asynchronous multiplayer bandits
Hugo Richard, Etienne Boursier, Vianney Perchet
31 May 2023

Competing for Shareable Arms in Multi-Player Multi-Armed Bandits
Renzhe Xu, Hongya Wang, Xingxuan Zhang, B. Li, Peng Cui
30 May 2023

An Instance-Dependent Analysis for the Cooperative Multi-Player Multi-Armed Bandit
Aldo Pacchiano, Peter L. Bartlett, Michael I. Jordan
08 Nov 2021

Be Greedy in Multi-Armed Bandits
Matthieu Jedor, Jonathan Louëdec, Vianney Perchet
04 Jan 2021