Hidden-Fold Networks: Random Recurrent Residuals Using Sparse Supermasks

arXiv:2111.12330 · 24 November 2021
Ángel López García-Arias, Masanori Hashimoto, Masato Motomura, Jaehoon Yu

Papers citing "Hidden-Fold Networks: Random Recurrent Residuals Using Sparse Supermasks"

4 / 4 papers shown
Expressivity of Neural Networks with Random Weights and Learned Biases
Ezekiel Williams, Avery Hee-Woon Ryoo, Thomas Jiralerspong, Alexandre Payeur, M. Perich, Luca Mazzucato, Guillaume Lajoie
01 Jul 2024
Multicoated and Folded Graph Neural Networks with Strong Lottery Tickets
Jiale Yan, Hiroaki Ito, Ángel López García-Arias, Yasuyuki Okoshi, Hikari Otsuka, Kazushi Kawamura, Thiem Van Chu, Masato Motomura
06 Dec 2023
MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications
Andrew G. Howard, Menglong Zhu, Bo Chen, Dmitry Kalenichenko, Weijun Wang, Tobias Weyand, M. Andreetto, Hartwig Adam
17 Apr 2017
Bridging the Gaps Between Residual Learning, Recurrent Neural Networks and Visual Cortex
Q. Liao, T. Poggio
13 Apr 2016