h-detach: Modifying the LSTM Gradient Towards Better Optimization

6 October 2018
Devansh Arpit
Bhargav Kanuparthi
Giancarlo Kerg
Nan Rosemary Ke
Ioannis Mitliagkas
Yoshua Bengio

Papers citing "h-detach: Modifying the LSTM Gradient Towards Better Optimization"

2 / 2 papers shown
Preventing posterior collapse in variational autoencoders for text generation via decoder regularization
Alban Petit
Caio Corro
DRL
28 Oct 2021
SCL: Towards Accurate Domain Adaptive Object Detection via Gradient Detach Based Stacked Complementary Losses
Zhiqiang Shen
Harsh Maheshwari
Weichen Yao
Marios Savvides
ObjD
06 Nov 2019