Can Transformers Learn Full Bayesian Inference in Context?
arXiv:2501.16825 · 28 January 2025
Arik Reuter, Tim G. J. Rudner, Vincent Fortuin, David Rügamer
Tag: BDL
Papers citing "Can Transformers Learn Full Bayesian Inference in Context?" (6 of 6 shown):
- Do-PFN: In-Context Learning for Causal Effect Estimation. Jake Robertson, Arik Reuter, Siyuan Guo, Noah Hollmann, Frank Hutter, Bernhard Schölkopf (CML). 06 Jun 2025.
- Position: The Future of Bayesian Prediction Is Prior-Fitted. Samuel G. Müller, Arik Reuter, Noah Hollmann, David Rügamer, Frank Hutter. 29 May 2025.
- Simplifying Bayesian Optimization Via In-Context Direct Optimum Sampling. Gustavo Sutter Pessurno de Carvalho, Mohammed Abdulrahman, Hao Wang, Sriram Ganapathi Subramanian, Marc St-Aubin, Sharon O'Sullivan, Lawrence Wan, Luis Ricardez-Sandoval, Pascal Poupart, Agustinus Kristiadi. 29 May 2025.
- TabPFN: One Model to Rule Them All? Qiong Zhang, Yan Shuo Tan, Qinglong Tian, Pengfei Li. 26 May 2025.
- Uncertainty Quantification for Prior-Data Fitted Networks using Martingale Posteriors. Thomas Nagler, David Rügamer (UQCV). 16 May 2025.
- Effortless, Simulation-Efficient Bayesian Inference using Tabular Foundation Models. Julius Vetter, Manuel Gloeckler, Daniel Gedon, Jakob H Macke. 24 Apr 2025.