Quantifying and maximizing the information flux in recurrent neural networks
arXiv: 2301.12892

30 January 2023
C. Metzner, Marius E. Yamakou, Dennis Voelkl, A. Schilling, P. Krauss

Papers citing "Quantifying and maximizing the information flux in recurrent neural networks"

4 papers shown
Analysis of Argument Structure Constructions in the Large Language Model BERT
Pegah Ramezani, Achim Schilling, Patrick Krauss
08 Aug 2024

Analysis of Argument Structure Constructions in a Deep Recurrent Language Model
Pegah Ramezani, Achim Schilling, Patrick Krauss
06 Aug 2024

Conceptual Cognitive Maps Formation with Neural Successor Networks and Word Embeddings
Paul Stoewer, A. Schilling, Andreas K. Maier, P. Krauss
04 Jul 2023

Word class representations spontaneously emerge in a deep neural network trained on next word prediction
K. Surendra, A. Schilling, Paul Stoewer, Andreas K. Maier, P. Krauss
15 Feb 2023