A Transfer Framework for Enhancing Temporal Graph Learning in Data-Scarce Settings

2 March 2025
Sidharth Agarwal, Tanishq Dubey, Shubham Gupta, Srikanta J. Bedathur
Communities: AI4TS, AI4CE
Abstract

Dynamic interactions between entities are prevalent in domains like social platforms, financial systems, healthcare, and e-commerce. These interactions can be effectively represented as time-evolving graphs, where predicting future connections is a key task in applications such as recommendation systems. Temporal Graph Neural Networks (TGNNs) have achieved strong results for such predictive tasks but typically require extensive training data, which is often limited in real-world scenarios. One approach to mitigating data scarcity is leveraging pre-trained models from related datasets. However, direct knowledge transfer between TGNNs is challenging due to their reliance on node-specific memory structures, making them inherently difficult to adapt across datasets. To address this, we introduce a novel transfer approach that disentangles node representations from their associated features through a structured bipartite encoding mechanism. This decoupling enables more effective transfer of memory components and other learned inductive patterns from one dataset to another. Empirical evaluations on real-world benchmarks demonstrate that our method significantly enhances TGNN performance in low-data regimes, outperforming non-transfer baselines by up to 56% and surpassing existing transfer strategies by 36%.
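As a rough illustration of the decoupling idea described in the abstract, the sketch below separates a node-ID-tied memory table from a node-agnostic feature encoder, so that only the feature-side weights are carried over to a new, data-scarce graph. This is not the authors' actual architecture; all class names, dimensions, and the transfer procedure here are hypothetical and only meant to show why decoupling makes memory-based models transferable.

```python
import torch
import torch.nn as nn

class BipartiteTemporalEncoder(nn.Module):
    """Hypothetical sketch: node memory is kept separate from a feature
    encoder so the feature pathway can be reused across graphs."""

    def __init__(self, num_nodes: int, feat_dim: int, mem_dim: int):
        super().__init__()
        # Node-specific memory: tied to this graph's node IDs, NOT transferable.
        self.memory = nn.Embedding(num_nodes, mem_dim)
        # Feature-side encoder: depends only on interaction features, transferable.
        self.feat_encoder = nn.Sequential(
            nn.Linear(feat_dim, mem_dim), nn.ReLU(), nn.Linear(mem_dim, mem_dim)
        )
        # Fuses memory and encoded features into a link score.
        self.scorer = nn.Linear(2 * mem_dim, 1)

    def forward(self, src, dst, edge_feat):
        h_src = torch.cat([self.memory(src), self.feat_encoder(edge_feat)], dim=-1)
        h_dst = torch.cat([self.memory(dst), self.feat_encoder(edge_feat)], dim=-1)
        return self.scorer(h_src * h_dst).squeeze(-1)  # future-link score


# Pretrain on a data-rich source graph, then copy only the node-agnostic
# parameters to the target graph; its memory table is re-initialized.
source = BipartiteTemporalEncoder(num_nodes=10_000, feat_dim=32, mem_dim=64)
target = BipartiteTemporalEncoder(num_nodes=500, feat_dim=32, mem_dim=64)

transferable = {k: v for k, v in source.state_dict().items()
                if not k.startswith("memory")}
target.load_state_dict(transferable, strict=False)
```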

View on arXiv
@article{agarwal2025_2503.00852,
  title={A Transfer Framework for Enhancing Temporal Graph Learning in Data-Scarce Settings},
  author={Sidharth Agarwal and Tanishq Dubey and Shubham Gupta and Srikanta Bedathur},
  journal={arXiv preprint arXiv:2503.00852},
  year={2025}
}