arXiv:2405.06368
DP-DyLoRA: Fine-Tuning Transformer-Based Models On-Device under Differentially Private Federated Learning using Dynamic Low-Rank Adaptation
10 May 2024
Jie Xu, Karthikeyan P. Saravanan, Rogier van Dalen, Haaris Mehmood, David Tuckey, Mete Ozay
Papers citing "DP-DyLoRA: Fine-Tuning Transformer-Based Models On-Device under Differentially Private Federated Learning using Dynamic Low-Rank Adaptation" (5 of 5 papers shown):
DP-MemArc: Differential Privacy Transfer Learning for Memory Efficient Language Models
Yanming Liu, Xinyue Peng, Yuwei Zhang, Xiaolan Ke, Songhang Deng, ..., Sheng Cheng, Xun Wang, Jianwei Yin, Tianyu Du, Xuhong Zhang
21 Feb 2025
DEeR: Deviation Eliminating and Noise Regulating for Privacy-preserving Federated Low-rank Adaptation
Meilu Zhu, Axiu Mao, Jun Liu, Yixuan Yuan
16 Oct 2024
Federated LoRA with Sparse Communication
Kevin Kuo, Arian Raje, Kousik Rajesh, Virginia Smith
07 Jun 2024
Differentially Private Fine-tuning of Language Models
Da Yu, Saurabh Naik, A. Backurs, Sivakanth Gopi, Huseyin A. Inan, ..., Y. Lee, Andre Manoel, Lukas Wutschitz, Sergey Yekhanin, Huishuai Zhang
13 Oct 2021
MobileViT: Light-weight, General-purpose, and Mobile-friendly Vision Transformer
Sachin Mehta, Mohammad Rastegari
05 Oct 2021