Fisher Information and Natural Gradient Learning of Random Deep Networks

22 August 2018
S. Amari, Ryo Karakida, Masafumi Oizumi
arXiv: 1808.07172

Papers citing "Fisher Information and Natural Gradient Learning of Random Deep Networks"

3 / 3 papers shown
Are queries and keys always relevant? A case study on Transformer wave functions
Riccardo Rende, Luciano Loris Viteritti
29 May 2024

Component-Wise Natural Gradient Descent -- An Efficient Neural Network Optimization
Tran van Sang, Mhd Irvan, R. Yamaguchi, Toshiyuki Nakata
11 Oct 2022

Any Target Function Exists in a Neighborhood of Any Sufficiently Wide Random Network: A Geometrical Perspective
S. Amari
20 Jan 2020