arXiv:2407.01033
Neural Networks Trained by Weight Permutation are Universal Approximators
1 July 2024
Yongqiang Cai, Gaohang Chen, Zhonghua Qiao
Papers citing "Neural Networks Trained by Weight Permutation are Universal Approximators" (4 papers):
From Task-Specific Models to Unified Systems: A Review of Model Merging Approaches
  Wei Ruan, Tianze Yang, Y. Zhou, Tianming Liu, Jin Lu (13 Mar 2025)

Git Re-Basin: Merging Models modulo Permutation Symmetries
  Samuel K. Ainsworth, J. Hayase, S. Srinivasa (11 Sep 2022)

Optimal Approximation Rate of ReLU Networks in terms of Width and Depth
  Zuowei Shen, Haizhao Yang, Shijun Zhang (28 Feb 2021)

Benefits of depth in neural networks
  Matus Telgarsky (14 Feb 2016)