ResearchTrend.AI

How Can We Be So Dense? The Benefits of Using Highly Sparse Representations

27 March 2019
Subutai Ahmad, Luiz Scheinkman
arXiv: 1903.11257

Papers citing "How Can We Be So Dense? The Benefits of Using Highly Sparse Representations"

18 papers shown:

  1. Information Consistent Pruning: How to Efficiently Search for Sparse Networks?
     Soheil Gharatappeh, S. Y. Sekeh. 28 Jan 2025.
  2. Sparsing Law: Towards Large Language Models with Greater Activation Sparsity
     Yuqi Luo, Chenyang Song, Xu Han, Y. Chen, Chaojun Xiao, Zhiyuan Liu, Maosong Sun. 04 Nov 2024.
  3. Hard ASH: Sparsity and the right optimizer make a continual learner
     Santtu Keskinen. [CLL] 26 Apr 2024.
  4. The Construction of Reality in an AI: A Review
     J. W. Johnston. [3DV] 03 Feb 2023.
  5. Competitive learning to generate sparse representations for associative memory
     Luis Sa-Couto, Andreas Wichert. 05 Jan 2023.
  6. Spatial Mixture-of-Experts
     Nikoli Dryden, Torsten Hoefler. [MoE] 24 Nov 2022.
  7. Cyclegan Network for Sheet Metal Welding Drawing Translation
     Zhiwei Song, Hui Yao, Dan Tian, Gaohui Zhan. [GAN, AI4CE] 28 Sep 2022.
  8. Extremely Simple Activation Shaping for Out-of-Distribution Detection
     Andrija Djurisic, Nebojsa Bozanic, Arjun Ashok, Rosanne Liu. [OODD] 20 Sep 2022.
  9. The Role Of Biology In Deep Learning
     Robert Bain. 07 Sep 2022.
  10. Context-sensitive neocortical neurons transform the effectiveness and efficiency of neural information processing
      Ahsan Adeel, Mario Franco, Mohsin Raza, K. Ahmed. 15 Jul 2022.
  11. Sparse Double Descent: Where Network Pruning Aggravates Overfitting
      Zhengqi He, Zeke Xie, Quanzhi Zhu, Zengchang Qin. 17 Jun 2022.
  12. Avoiding Catastrophe: Active Dendrites Enable Multi-Task Learning in Dynamic Environments
      A. Iyer, Karan Grewal, Akash Velu, Lucas O. Souza, Jérémy Forest, Subutai Ahmad. [AI4CE] 31 Dec 2021.
  13. Two Sparsities Are Better Than One: Unlocking the Performance Benefits of Sparse-Sparse Networks
      Kevin Lee Hunter, Lawrence Spracklen, Subutai Ahmad. 27 Dec 2021.
  14. Training Deep Spiking Auto-encoders without Bursting or Dying Neurons through Regularization
      Justus F. Hübotter, Pablo Lanillos, Jakub M. Tomczak. 22 Sep 2021.
  15. Neural network relief: a pruning algorithm based on neural activity
      Aleksandr Dekhovich, David Tax, M. Sluiter, Miguel A. Bessa. 22 Sep 2021.
  16. On Incorporating Inductive Biases into VAEs
      Ning Miao, Emile Mathieu, N. Siddharth, Yee Whye Teh, Tom Rainforth. [CML, DRL] 25 Jun 2021.
  17. A brain basis of dynamical intelligence for AI and computational neuroscience
      J. Monaco, Kanaka Rajan, Grace M. Hwang. [AI4CE] 15 May 2021.
  18. Ultra-High Dimensional Sparse Representations with Binarization for Efficient Text Retrieval
      Kyoung-Rok Jang, Junmo Kang, Giwon Hong, Sung-Hyon Myaeng, Joohee Park, Taewon Yoon, Heecheol Seo. 15 Apr 2021.