Pure Exploration in Kernel and Neural Bandits
Yinglun Zhu, Dongruo Zhou, Ruoxi Jiang, Quanquan Gu, Rebecca Willett, Robert D. Nowak
22 June 2021 · arXiv:2106.12034

Papers citing "Pure Exploration in Kernel and Neural Bandits"

12 citing papers shown.

Online Clustering of Dueling Bandits
Zhiyong Wang, Jiahang Sun, Mingze Kong, Jize Xie, Qinghua Hu, J. C. Lui, Zhongxiang Dai
04 Feb 2025

Optimal Design for Human Feedback
Subhojyoti Mukherjee, Anusha Lalitha, Kousha Kalantari, Aniket Deshmukh, Ge Liu, Yifei Ma, Branislav Kveton
22 Apr 2024

Efficient Prompt Optimization Through the Lens of Best Arm Identification
Chengshuai Shi, Kun Yang, Zihan Chen, Jundong Li, Jing Yang, Cong Shen
15 Feb 2024

Multi-task Representation Learning for Pure Exploration in Bilinear Bandits
Subhojyoti Mukherjee, Qiaomin Xie, Josiah P. Hanna, Robert D. Nowak
01 Nov 2023

Pure Exploration in Asynchronous Federated Bandits (FedML)
Zichen Wang, Chuanhao Li, Chenyu Song, Lianghui Wang, Quanquan Gu, Huazheng Wang
17 Oct 2023

Kernel ε-Greedy for Multi-Armed Bandits with Covariates
Sakshi Arya, Bharath K. Sriperumbudur
29 Jun 2023

(Private) Kernelized Bandits with Distributed Biased Feedback
Fengjiao Li, Xingyu Zhou, Bo Ji
28 Jan 2023

Neural Bandits for Data Mining: Searching for Dangerous Polypharmacy
Alexandre Larouche, Audrey Durand, Richard Khoury, C. Sirois
10 Dec 2022

Federated Neural Bandits (FedML)
Zhongxiang Dai, Yao Shu, Arun Verma, Flint Xiaofeng Fan, Bryan Kian Hsiang Low, Patrick Jaillet
28 May 2022

Reward-Biased Maximum Likelihood Estimation for Neural Contextual Bandits
Yu-Heng Hung, Ping-Chun Hsieh
08 Mar 2022

Collaborative Pure Exploration in Kernel Bandit
Yihan Du, Wei Chen, Yuko Kuroki, Longbo Huang
29 Oct 2021

Near Instance Optimal Model Selection for Pure Exploration Linear Bandits
Yinglun Zhu, Julian Katz-Samuels, Robert D. Nowak
10 Sep 2021