ResearchTrend.AI
Neural Networks Trained by Weight Permutation are Universal Approximators
arXiv:2407.01033 · 1 July 2024
Yongqiang Cai, Gaohang Chen, Zhonghua Qiao

Papers citing "Neural Networks Trained by Weight Permutation are Universal Approximators" (4 papers)

1. From Task-Specific Models to Unified Systems: A Review of Model Merging Approaches [MoMe]
   Wei Ruan, Tianze Yang, Y. Zhou, Tianming Liu, Jin Lu · 13 Mar 2025
2. Git Re-Basin: Merging Models modulo Permutation Symmetries [MoMe]
   Samuel K. Ainsworth, J. Hayase, S. Srinivasa · 11 Sep 2022
3. Optimal Approximation Rate of ReLU Networks in terms of Width and Depth
   Zuowei Shen, Haizhao Yang, Shijun Zhang · 28 Feb 2021
4. Benefits of depth in neural networks
   Matus Telgarsky · 14 Feb 2016