
Learning to Modulate Random Weights: Neuromodulation-inspired Neural Networks For Efficient Continual Learning

8 April 2022
Jinyung Hong, Theodore P. Pavlic
CLL

Papers citing "Learning to Modulate Random Weights: Neuromodulation-inspired Neural Networks For Efficient Continual Learning"

2 / 2 papers shown
Brain-Inspired Continual Learning-Robust Feature Distillation and Re-Consolidation for Class Incremental Learning
Hikmat Khan, N. Bouaynaya, Ghulam Rasool
CLL
22 Apr 2024
An Insect-Inspired Randomly, Weighted Neural Network with Random Fourier Features For Neuro-Symbolic Relational Learning
Jinyung Hong, Theodore P. Pavlic
11 Sep 2021