© 2025 ResearchTrend.AI, All rights reserved.

Improved Training Speed, Accuracy, and Data Utilization Through Loss Function Optimization (arXiv:1905.11528)

27 May 2019
Santiago Gonzalez
Risto Miikkulainen

Papers citing "Improved Training Speed, Accuracy, and Data Utilization Through Loss Function Optimization"

35 / 35 papers shown (each entry: title; authors · topic tags · citation count · date)

GANetic Loss for Generative Adversarial Networks with a Focus on Medical Applications
S. Akhmedova, Nils Körber · GAN, MedIm · 0 citations · 07 Jun 2024

Next Generation Loss Function for Image Classification
S. Akhmedova, Nils Körber · VLM · 2 citations · 19 Apr 2024

Evolving Loss Functions for Specific Image Augmentation Techniques
Brandon Morgan, Dean Frederick Hougen · 0 citations · 09 Apr 2024

Fast and Efficient Local Search for Genetic Programming Based Loss Function Learning
Christian Raymond, Qi Chen, Bing Xue, Mengjie Zhang · 2 citations · 01 Mar 2024

Predicting O-GlcNAcylation Sites in Mammalian Proteins with Transformers and RNNs Trained with a New Loss Function
Pedro Seber · 2 citations · 27 Feb 2024

Neural Loss Function Evolution for Large-Scale Image Classifier Convolutional Neural Networks
Brandon Morgan, Dean Frederick Hougen · 2 citations · 30 Jan 2024

Meta-tuning Loss Functions and Data Augmentation for Few-shot Object Detection
B. Demirel, Orhun Bugra Baran, R. G. Cinbis · 10 citations · 24 Apr 2023

Online Loss Function Learning
Christian Raymond, Qi Chen, Bing Xue, Mengjie Zhang · 5 citations · 30 Jan 2023

Efficient Activation Function Optimization through Surrogate Modeling
G. Bingham, Risto Miikkulainen · 2 citations · 13 Jan 2023

Compute-Efficient Deep Learning: Algorithmic Trends and Opportunities
Brian Bartoldson, B. Kailkhura, Davis W. Blalock · 47 citations · 13 Oct 2022

Learning Symbolic Model-Agnostic Loss Functions via Meta-Learning
Christian Raymond, Qi Chen, Bing Xue, Mengjie Zhang · FedML · 11 citations · 19 Sep 2022

Sharp-MAML: Sharpness-Aware Model-Agnostic Meta Learning
Momin Abbas, Quan-Wu Xiao, Lisha Chen, Pin-Yu Chen, Tianyi Chen · 78 citations · 08 Jun 2022

PolyLoss: A Polynomial Expansion Perspective of Classification Loss Functions
Zhaoqi Leng, Mingxing Tan, Chenxi Liu, E. D. Cubuk, Xiaojie Shi, Shuyang Cheng, Drago Anguelov · 134 citations · 26 Apr 2022

Generating meta-learning tasks to evolve parametric loss for classification learning
Zhaoyang Hai, Xiabi Liu, Yuchen Ren, N. Q. Soomro · 0 citations · 20 Nov 2021

GCCN: Global Context Convolutional Network
Ali Hamdi, Flora D. Salim, D. Kim · 1 citation · 22 Oct 2021

Signature-Graph Networks
Ali Hamdi, Flora D. Salim, D. Kim, Xiaojun Chang · 1 citation · 22 Oct 2021

Problem Learning: Towards the Free Will of Machines
Yongfeng Zhang · FaML · 2 citations · 01 Sep 2021

Learning an Explicit Hyperparameter Prediction Function Conditioned on Tasks
Jun Shu, Deyu Meng, Zongben Xu · 6 citations · 06 Jul 2021

Evolving parametrized Loss for Image Classification Learning on Small Datasets
Zhaoyang Hai, Xiabi Liu · 0 citations · 15 Mar 2021

Population-Based Evolution Optimizes a Meta-Learning Objective
Kevin Frans, Olaf Witkowski · 5 citations · 11 Mar 2021

Searching for Robustness: Loss Learning for Noisy Classification Tasks
Boyan Gao, Henry Gouk, Timothy M. Hospedales · OOD, NoLa · 18 citations · 27 Feb 2021

Evolving GAN Formulations for Higher Quality Image Synthesis
Santiago Gonzalez, Mohak Kant, Risto Miikkulainen · GAN · 4 citations · 17 Feb 2021

Loss Function Discovery for Object Detection via Convergence-Simulation Driven Search
Peidong Liu, Gengwei Zhang, Bochao Wang, Hang Xu, Xiaodan Liang, Yong-jia Jiang, Zhenguo Li · 28 citations · 09 Feb 2021

Effective Regularization Through Loss-Function Metalearning
Santiago Gonzalez, Risto Miikkulainen · 5 citations · 02 Oct 2020

Lights and Shadows in Evolutionary Deep Learning: Taxonomy, Critical Methodological Analysis, Cases of Study, Learned Lessons, Recommendations and Challenges
Aritz D. Martinez, Javier Del Ser, Esther Villar-Rodriguez, E. Osaba, Javier Poyatos, Siham Tabik, Daniel Molina, Francisco Herrera · 26 citations · 09 Aug 2020

flexgrid2vec: Learning Efficient Visual Representations Vectors
Ali Hamdi, D. Kim, Flora D. Salim · SSL, GNN · 7 citations · 30 Jul 2020

Discovering Parametric Activation Functions
G. Bingham, Risto Miikkulainen · ODL · 70 citations · 05 Jun 2020

Meta-Learning in Neural Networks: A Survey
Timothy M. Hospedales, Antreas Antoniou, P. Micaelli, Amos Storkey · OOD · 1,935 citations · 11 Apr 2020

Meta-learning curiosity algorithms
Ferran Alet, Martin Schneider, Tomas Lozano-Perez, L. Kaelbling · 63 citations · 11 Mar 2020

Evolutionary Optimization of Deep Learning Activation Functions
G. Bingham, William Macke, Risto Miikkulainen · ODL · 50 citations · 17 Feb 2020

Regularized Evolutionary Population-Based Training
J. Liang, Santiago Gonzalez, H. Shahrzad, Risto Miikkulainen · 9 citations · 11 Feb 2020

FastGAE: Scalable Graph Autoencoders with Stochastic Subgraph Decoding
Guillaume Salha-Galvan, Romain Hennequin, Jean-Baptiste Remy, Manuel Moussallam, Michalis Vazirgiannis · GNN, BDL · 6 citations · 05 Feb 2020

Optimizing Loss Functions Through Multivariate Taylor Polynomial Parameterization
Santiago Gonzalez, Risto Miikkulainen · 9 citations · 31 Jan 2020

Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning
Y. Gal, Zoubin Ghahramani · UQCV, BDL · 9,145 citations · 06 Jun 2015

Improving neural networks by preventing co-adaptation of feature detectors
Geoffrey E. Hinton, Nitish Srivastava, A. Krizhevsky, Ilya Sutskever, Ruslan Salakhutdinov · VLM · 7,638 citations · 03 Jul 2012