Controllable Orthogonalization in Training DNNs

2 April 2020 · arXiv:2004.00917
Lei Huang, Li Liu, Fan Zhu, Diwen Wan, Zehuan Yuan, Bo Li, Ling Shao
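The title refers to making weight matrices approximately orthogonal during training, with the degree of orthogonality left under the user's control. As a rough, unofficial sketch of that general idea (not the authors' reference implementation; the function name, scaling choice, and iteration count below are assumptions), a weight matrix can be pushed toward row-orthogonality by multiplying it with an approximate inverse square root of its Gram matrix, computed with a few Newton iterations:

import numpy as np

def newton_orthogonalize(W, iters=5):
    # Illustrative sketch: approximately orthogonalize the rows of W (n x d, n <= d)
    # by applying an approximate inverse square root of the Gram matrix S = W W^T.
    n, _ = W.shape
    S = W @ W.T                      # Gram matrix, n x n
    s = np.linalg.norm(S)            # Frobenius norm, used only for scaling
    S_n = S / s                      # scaling keeps the Newton iteration convergent
    B = np.eye(n)
    for _ in range(iters):           # Newton iteration converging to S_n^(-1/2)
        B = 1.5 * B - 0.5 * B @ B @ B @ S_n
    return (B @ W) / np.sqrt(s)      # rows are now approximately orthonormal

More iterations give a stricter orthogonality constraint at higher cost; fewer iterations give a looser, cheaper one.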

Papers citing "Controllable Orthogonalization in Training DNNs"

18 papers shown

1-Lipschitz Neural Networks are more expressive with N-Activations
Bernd Prach, Christoph H. Lampert
AAML, FAtt · 10 Nov 2023

Localization using Multi-Focal Spatial Attention for Masked Face Recognition
Samrudhdhi B. Rangrej, Hanbyel Cho, H. Hong, James J. Clark, Dongmin Cho, JungWoo Chang, Junmo Kim
CVBM · 03 May 2023

Almost-Orthogonal Layers for Efficient General-Purpose Lipschitz Networks
Bernd Prach, Christoph H. Lampert
05 Aug 2022

Feedback Gradient Descent: Efficient and Stable Optimization with Orthogonality for DNNs
Fanchen Bu, D. Chang
12 May 2022

Compact Model Training by Low-Rank Projection with Energy Transfer
K. Guo, Zhenquan Lin, Xiaofen Xing, Fang Liu, Xiangmin Xu
12 Apr 2022

projUNN: efficient method for training deep networks with unitary matrices
B. Kiani, Randall Balestriero, Yann LeCun, S. Lloyd
10 Mar 2022

Adversarially Robust Models may not Transfer Better: Sufficient Conditions for Domain Transferability from the View of Regularization
Xiaojun Xu, Jacky Y. Zhang, Evelyn Ma, Danny Son, Oluwasanmi Koyejo, Bo-wen Li
03 Feb 2022

Weight Evolution: Improving Deep Neural Networks Training through Evolving Inferior Weight Values
Zhenquan Lin, K. Guo, Xiaofen Xing, Xiangmin Xu
ODL · 09 Oct 2021

Orthogonal Graph Neural Networks
Kai Guo, Kaixiong Zhou, Xia Hu, Yu Li, Yi Chang, Xin Wang
23 Sep 2021

Existence, Stability and Scalability of Orthogonal Convolutional Neural Networks
E. M. Achour, François Malgouyres, Franck Mamalet
12 Aug 2021

Coordinate descent on the orthogonal group for recurrent neural network training
E. Massart, V. Abrol
30 Jul 2021

Dirichlet Energy Constrained Learning for Deep Graph Neural Networks
Kaixiong Zhou, Xiao Shi Huang, Daochen Zha, Rui Chen, Li Li, Soo-Hyun Choi, Xia Hu
GNN, AI4CE · 06 Jul 2021

Improving Unsupervised Domain Adaptation by Reducing Bi-level Feature Redundancy
Mengzhu Wang, Xiang Zhang, L. Lan, Wei Wang, Huibin Tan, Zhigang Luo
AI4CE · 28 Dec 2020

Transform Quantization for CNN (Convolutional Neural Network) Compression
Sean I. Young, Wang Zhe, David S. Taubman, B. Girod
MQ · 02 Sep 2020

Deep Isometric Learning for Visual Recognition
Haozhi Qi, Chong You, X. Wang, Yi-An Ma, Jitendra Malik
VLM · 30 Jun 2020

Dynamical Isometry and a Mean Field Theory of CNNs: How to Train 10,000-Layer Vanilla Convolutional Neural Networks
Lechao Xiao, Yasaman Bahri, Jascha Narain Sohl-Dickstein, S. Schoenholz, Jeffrey Pennington
14 Jun 2018

Optimization on Submanifolds of Convolution Kernels in CNNs
Mete Ozay, Takayuki Okatani
22 Oct 2016

Learning Unitary Operators with Help From u(n)
Stephanie L. Hyland, Gunnar Rätsch
17 Jul 2016