The Role of Permutation Invariance in Linear Mode Connectivity of Neural Networks
International Conference on Learning Representations (ICLR), 2021
12 October 2021
R. Entezari, Hanie Sedghi, O. Saukh, Behnam Neyshabur

Papers citing "The Role of Permutation Invariance in Linear Mode Connectivity of Neural Networks"

Showing 12 of 212 citing papers.

Git Re-Basin: Merging Models modulo Permutation Symmetries
International Conference on Learning Representations (ICLR), 2022
Samuel K. Ainsworth, J. Hayase, S. Srinivasa
11 Sep 2022

Patching open-vocabulary models by interpolating weights
Neural Information Processing Systems (NeurIPS), 2022
Gabriel Ilharco, Mitchell Wortsman, S. Gadre, Shuran Song, Hannaneh Hajishirzi, Simon Kornblith, Ali Farhadi, Ludwig Schmidt
10 Aug 2022

Branch-Train-Merge: Embarrassingly Parallel Training of Expert Language Models
Margaret Li, Suchin Gururangan, Tim Dettmers, M. Lewis, Tim Althoff, Noah A. Smith, Luke Zettlemoyer
05 Aug 2022

Geometrically Guided Integrated Gradients
Md. Mahfuzur Rahman, N. Lewis, Sergey Plis
13 Jun 2022

Trajectory-dependent Generalization Bounds for Deep Neural Networks via Fractional Brownian Motion
Chengli Tan, Jiang Zhang, Junmin Liu
09 Jun 2022

Lottery Tickets on a Data Diet: Finding Initializations with Sparse Trainable Networks
Neural Information Processing Systems (NeurIPS), 2022
Mansheej Paul, Brett W. Larsen, Surya Ganguli, Jonathan Frankle, Gintare Karolina Dziugaite
02 Jun 2022

Feature Space Particle Inference for Neural Network Ensembles
International Conference on Machine Learning (ICML), 2022
Shingo Yashima, Teppei Suzuki, Kohta Ishikawa, Ikuro Sato, Rei Kawakami
02 Jun 2022

On the Symmetries of Deep Learning Models and their Internal Representations
Neural Information Processing Systems (NeurIPS), 2022
Charles Godfrey, Davis Brown, Tegan H. Emerson, Henry Kvinge
27 May 2022

Linear Connectivity Reveals Generalization Strategies
International Conference on Learning Representations (ICLR), 2022
Jeevesh Juneja, Rachit Bansal, Kyunghyun Cho, João Sedoc, Naomi Saphra
24 May 2022

Fusing finetuned models for better pretraining
Leshem Choshen, Elad Venezian, Noam Slonim, Yoav Katz
06 Apr 2022

Deep Networks on Toroids: Removing Symmetries Reveals the Structure of Flat Regions in the Landscape Geometry
International Conference on Machine Learning (ICML), 2022
Fabrizio Pittorino, Antonio Ferraro, Gabriele Perugini, Christoph Feinauer, Carlo Baldassi, R. Zecchina
07 Feb 2022

Noether's Learning Dynamics: Role of Symmetry Breaking in Neural Networks
Neural Information Processing Systems (NeurIPS), 2021
Hidenori Tanaka, D. Kunin
06 May 2021