Why Tabular Foundation Models Should Be a Research Priority
B. V. Breugel, M. Schaar
arXiv 2405.01147, 2 May 2024
Tags: LMTD, VLM, AI4CE
Papers citing "Why Tabular Foundation Models Should Be a Research Priority" (8 papers shown):
Generative Calibration for In-context Learning. Zhongtao Jiang, Yuanzhe Zhang, Cao Liu, Jun Zhao, Kang Liu. 16 Oct 2023.
From Supervised to Generative: A Novel Paradigm for Tabular Deep Learning with Large Language Models. Xumeng Wen, Han Zhang, Shun Zheng, Wei Xu, Jiang Bian. 11 Oct 2023. Tags: LMTD, ALM.
Multimodal Foundation Models: From Specialists to General-Purpose Assistants. Chunyuan Li, Zhe Gan, Zhengyuan Yang, Jianwei Yang, Linjie Li, Lijuan Wang, Jianfeng Gao. 18 Sep 2023. Tags: MLLM.
Chain-of-Knowledge: Grounding Large Language Models via Dynamic Knowledge Adapting over Heterogeneous Sources. Xingxuan Li, Ruochen Zhao, Yew Ken Chia, Bosheng Ding, Shafiq R. Joty, Soujanya Poria, Lidong Bing. 22 May 2023. Tags: HILM, BDL, LRM.
How Faithful is your Synthetic Data? Sample-level Metrics for Evaluating and Auditing Generative Models. Ahmed Alaa, B. V. Breugel, Evgeny S. Saveliev, M. Schaar. 17 Feb 2021.
TabTransformer: Tabular Data Modeling Using Contextual Embeddings. Xin Huang, A. Khetan, Milan Cvitkovic, Zohar S. Karnin. 11 Dec 2020. Tags: ViT, LMTD.
Calibration of Pre-trained Transformers. Shrey Desai, Greg Durrett. 17 Mar 2020. Tags: UQLM.
Scaling Laws for Neural Language Models. Jared Kaplan, Sam McCandlish, T. Henighan, Tom B. Brown, B. Chess, R. Child, Scott Gray, Alec Radford, Jeff Wu, Dario Amodei. 23 Jan 2020.