Provable Multi-Task Representation Learning by Two-Layer ReLU Neural Networks

13 July 2023
Liam Collins, Hamed Hassani, Mahdi Soltanolkotabi, Aryan Mokhtari, Sanjay Shakkottai

Papers citing "Provable Multi-Task Representation Learning by Two-Layer ReLU Neural Networks" (13 papers)
1. Evidential Uncertainty Probes for Graph Neural Networks
   Linlin Yu, Kangshuo Li, Pritom Kumar Saha, Yifei Lou, Feng Chen. 11 Mar 2025. Tags: EDL, UQCV.

2. Meta-learning of shared linear representations beyond well-specified linear regression
   Mathieu Even, Laurent Massoulié. 31 Jan 2025.

3. Pretrained transformer efficiently learns low-dimensional target functions in-context
   Kazusato Oko, Yujin Song, Taiji Suzuki, Denny Wu. 04 Nov 2024.

4. The Effects of Multi-Task Learning on ReLU Neural Network Functions
   Julia B. Nakhleh, Joseph Shenouda, Robert D. Nowak. 29 Oct 2024.

5. Robust Feature Learning for Multi-Index Models in High Dimensions
   Alireza Mousavi-Hosseini, Adel Javanmard, Murat A. Erdogdu. 21 Oct 2024. Tags: OOD, AAML.

6. Guarantees for Nonlinear Representation Learning: Non-identical Covariates, Dependent Data, Fewer Samples
   Thomas T. Zhang, Bruce D. Lee, Ingvar M. Ziemann, George J. Pappas, Nikolai Matni. 15 Oct 2024. Tags: CML, OOD.

7. Meta-Learning Operators to Optimality from Multi-Task Non-IID Data
   Thomas T. Zhang, Leonardo F. Toso, James Anderson, Nikolai Matni. 08 Aug 2023.

8. Sparks of Artificial General Intelligence: Early experiments with GPT-4
   Sébastien Bubeck, Varun Chandrasekaran, Ronen Eldan, J. Gehrke, Eric Horvitz, ..., Scott M. Lundberg, Harsha Nori, Hamid Palangi, Marco Tulio Ribeiro, Yi Zhang. 22 Mar 2023. Tags: ELM, AI4MH, AI4CE, ALM.

9. Neural Networks Efficiently Learn Low-Dimensional Representations with SGD
   Alireza Mousavi-Hosseini, Sejun Park, M. Girotti, Ioannis Mitliagkas, Murat A. Erdogdu. 29 Sep 2022. Tags: MLT.

10. Active Multi-Task Representation Learning
    Yifang Chen, S. Du, Kevin G. Jamieson. 02 Feb 2022.

11. Multitask Prompted Training Enables Zero-Shot Task Generalization
    Victor Sanh, Albert Webson, Colin Raffel, Stephen H. Bach, Lintang Sutawika, ..., T. Bers, Stella Biderman, Leo Gao, Thomas Wolf, Alexander M. Rush. 15 Oct 2021. Tags: LRM.

12. A Local Convergence Theory for Mildly Over-Parameterized Two-Layer Neural Network
    Mo Zhou, Rong Ge, Chi Jin. 04 Feb 2021.

13. The large learning rate phase of deep learning: the catapult mechanism
    Aitor Lewkowycz, Yasaman Bahri, Ethan Dyer, Jascha Narain Sohl-Dickstein, Guy Gur-Ari. 04 Mar 2020. Tags: ODL.