arXiv:2210.16947
Two Models are Better than One: Federated Learning Is Not Private For Google GBoard Next Word Prediction
30 October 2022 · Mohamed Suliman, D. Leith
Topics: SILM, FedML
Papers citing "Two Models are Better than One: Federated Learning Is Not Private For Google GBoard Next Word Prediction" (6 papers)
Federated Learning in Practice: Reflections and Projections
Katharine Daly, Hubert Eichner, Peter Kairouz, H. B. McMahan, Daniel Ramage, Zheng Xu · FedML · 11 Oct 2024
Confidential Federated Computations
Hubert Eichner, Daniel Ramage, Kallista A. Bonawitz, Dzmitry Huba, Tiziano Santoro, ..., Albert Cheu, Katharine Daly, Adria Gascon, Marco Gruteser, Brendan McMahan · 16 Apr 2024
Private Federated Learning with Autotuned Compression
Enayat Ullah, Christopher A. Choquette-Choo, Peter Kairouz, Sewoong Oh · FedML · 20 Jul 2023
When the Curious Abandon Honesty: Federated Learning Is Not Private
Franziska Boenisch, Adam Dziedzic, R. Schuster, Ali Shahin Shamsabadi, Ilia Shumailov, Nicolas Papernot · FedML, AAML · 06 Dec 2021
Practical and Private (Deep) Learning without Sampling or Shuffling
Peter Kairouz, Brendan McMahan, Shuang Song, Om Thakkar, Abhradeep Thakurta, Zheng Xu · FedML · 26 Feb 2021
Extracting Training Data from Large Language Models
Nicholas Carlini, Florian Tramèr, Eric Wallace, Matthew Jagielski, Ariel Herbert-Voss, ..., Tom B. Brown, D. Song, Ulfar Erlingsson, Alina Oprea, Colin Raffel · MLAU, SILM · 14 Dec 2020