
How Infinitely Wide Neural Networks Can Benefit from Multi-task Learning -- an Exact Macroscopic Characterization

31 December 2021
Jakob Heiss
Josef Teichmann
Hanna Wutte
    MLT

Papers citing "How Infinitely Wide Neural Networks Can Benefit from Multi-task Learning -- an Exact Macroscopic Characterization"

2 / 2 papers shown

Extending Path-Dependent NJ-ODEs to Noisy Observations and a Dependent Observation Framework
William Andersson
Jakob Heiss
Florian Krach
Josef Teichmann
24 Jul 2023

NOMU: Neural Optimization-based Model Uncertainty
Jakob Heiss
Jakob Weissteiner
Hanna Wutte
Sven Seuken
Josef Teichmann
BDL
26 Feb 2021