
FLOPs as a Direct Optimization Objective for Learning Sparse Neural Networks

7 November 2018
Gautam Bhattacharya, Ashutosh Adhikari, Md. Jahangir Alam
ArXiv (abs) · PDF · HTML

Papers citing "FLOPs as a Direct Optimization Objective for Learning Sparse Neural Networks"

13 citing papers shown
H-FLTN: A Privacy-Preserving Hierarchical Framework for Electric Vehicle Spatio-Temporal Charge Prediction
Robert Marlin, Raja Jurdak, Alsharif Abuadbba
25 Feb 2025

Kolmogorov-Arnold Fourier Networks
Jusheng Zhang, Yijia Fan, Kaitong Cai, Keze Wang
09 Feb 2025

FALCON: FLOP-Aware Combinatorial Optimization for Neural Network Pruning (AISTATS, 2024)
Xiang Meng, Wenyu Chen, Riade Benbaki, Rahul Mazumder
11 Mar 2024

REFT: Resource-Efficient Federated Training Framework for Heterogeneous and Resource-Constrained Environments
Humaid Ahmed Desai, Amr B. Hilal, Hoda Eldardiry
25 Aug 2023

Practical Conformer: Optimizing size, speed and flops of Conformer for on-Device and cloud ASR
Rami Botros, Anmol Gulati, Tara N. Sainath, K. Choromanski, Ruoming Pang, Trevor Strohman, Weiran Wang, Jiahui Yu
31 Mar 2023

Robust and Resource-efficient Machine Learning Aided Viewport Prediction in Virtual Reality
Yuang Jiang, Konstantinos Poularakis, Diego Kiedanski, S. Kompella, Leandros Tassiulas
20 Dec 2022

Dirichlet Pruning for Neural Network Compression
Kamil Adamczewski, Mijung Park
10 Nov 2020

Length-Adaptive Transformer: Train Once with Length Drop, Use Anytime with Search
Gyuwan Kim, Dong Wang
14 Oct 2020

Model Pruning Enables Efficient Federated Learning on Edge Devices (IEEE TNNLS, 2019)
Yuang Jiang, Maroun Touma, Victor Valls, Bongjun Ko, Yan Koyfman, Kin K. Leung, Leandros Tassiulas
26 Sep 2019

Neuron ranking -- an informed way to condense convolutional neural networks architecture
Kamil Adamczewski, Mijung Park
03 Jul 2019

Distilling Task-Specific Knowledge from BERT into Simple Neural Networks
Raphael Tang, Yao Lu, Linqing Liu, Lili Mou, Olga Vechtomova, Jimmy J. Lin
28 Mar 2019

Radial and Directional Posteriors for Bayesian Neural Networks
Changyong Oh, Kamil Adamczewski, Mijung Park
07 Feb 2019

Fast On-the-fly Retraining-free Sparsification of Convolutional Neural Networks
Amir H. Ashouri, T. Abdelrahman, Alwyn Dos Remedios
10 Nov 2018