
Activation by Interval-wise Dropout: A Simple Way to Prevent Neural Networks from Plasticity Loss (arXiv:2502.01342)

3 February 2025
Sangyeon Park, Isaac Han, Seungwon Oh, Kyung-Joong Kim

Papers citing "Activation by Interval-wise Dropout: A Simple Way to Prevent Neural Networks from Plasticity Loss"

2 / 2 papers shown
Plasticine: Accelerating Research in Plasticity-Motivated Deep Reinforcement Learning
Mingqi Yuan, Qi Wang, Guozheng Ma, Bo-wen Li, Xin Jin, Yunbo Wang, Xiaokang Yang, Wenjun Zeng, D. Tao
24 Apr 2025
Neuroplasticity in Artificial Intelligence -- An Overview and Inspirations on Drop In & Out Learning
Yupei Li, M. Milling, Björn Schuller
27 Mar 2025