
An exploration of parameter redundancy in deep networks with circulant projections (arXiv:1502.03436)

11 February 2015
Yu Cheng
Felix X. Yu
Rogerio Feris
Sanjiv Kumar
A. Choudhary
Shih-Fu Chang

Papers citing "An exploration of parameter redundancy in deep networks with circulant projections"

6 papers shown:

  1. Low-rank Tensor Decomposition for Compression of Convolutional Neural Networks Using Funnel Regularization. Bo-Shiuan Chu, Che-Rung Lee. 07 Dec 2021.
  2. Reliable Identification of Redundant Kernels for Convolutional Neural Network Compression. Wei Wang, Liqiang Zhu. 10 Dec 2018.
  3. Fast and Accurate Person Re-Identification with RMNet. Evgeny Izutov. 06 Dec 2018.
  4. Structured Transforms for Small-Footprint Deep Learning. Vikas Sindhwani, Tara N. Sainath, Sanjiv Kumar. 06 Oct 2015.
  5. Compact Nonlinear Maps and Circulant Extensions. Felix X. Yu, Sanjiv Kumar, H. Rowley, Shih-Fu Chang. 12 Mar 2015.
  6. Improving neural networks by preventing co-adaptation of feature detectors. Geoffrey E. Hinton, Nitish Srivastava, A. Krizhevsky, Ilya Sutskever, Ruslan Salakhutdinov. 03 Jul 2012.