ResearchTrend.AI

Evolutionary Stochastic Gradient Descent for Optimization of Deep Neural Networks

16 October 2018
Xiaodong Cui, Wei Zhang, Zoltán Tüske, M. Picheny

Papers citing "Evolutionary Stochastic Gradient Descent for Optimization of Deep Neural Networks"

5 papers shown
Perception-Distortion Balanced Super-Resolution: A Multi-Objective Optimization Perspective
Lingchen Sun, Jiejing Liang, Shuai Liu, Hongwei Yong, Lei Zhang
24 Dec 2023
Optimizing Neural Networks with Gradient Lexicase Selection
Lijie Ding, Lee Spector
19 Dec 2023
Ever Evolving Evaluator (EV3): Towards Flexible and Reliable Meta-Optimization for Knowledge Distillation
Li Ding, M. Zoghi, Guy Tennenholtz, Maryam Karimzadehgan
29 Oct 2023
Genetically Modified Wolf Optimization with Stochastic Gradient Descent for Optimising Deep Neural Networks
Manuel Bradicic, M. Sitarz, Felix Sylvest Olesen
21 Jan 2023
Demystifying Parallel and Distributed Deep Learning: An In-Depth Concurrency Analysis
Tal Ben-Nun, Torsten Hoefler
26 Feb 2018