ResearchTrend.AI
PDT: Pretrained Dual Transformers for Time-aware Bipartite Graphs
arXiv: 2306.01913 (v3, latest) · 2 June 2023
Xinlan Dai, Yujie Fan, Zhongfang Zhuang, Shubham Jain, Chin-Chia Michael Yeh, Junpeng Wang, Liang Wang, Yan Zheng, Prince Osei Aboagye, Wei Zhang
Links: arXiv (abs) · PDF · HTML

Papers citing "PDT: Pretrained Dual Transformers for Time-aware Bipartite Graphs"

Showing 2 of 2 citing papers.
1. Has Your Pretrained Model Improved? A Multi-head Posterior Based Approach
   Prince Osei Aboagye, Yan Zheng, Junpeng Wang, Uday Singh Saini, Xin Dai, ..., Yujie Fan, Zhongfang Zhuang, Shubham Jain, Liang Wang, Wei Zhang
   02 Jan 2024
2. S^3-Rec: Self-Supervised Learning for Sequential Recommendation with Mutual Information Maximization
   Kun Zhou, Haibo Wang, Wayne Xin Zhao, Yutao Zhu, Sirui Wang, Fuzheng Zhang, Zhongyuan Wang, Ji-Rong Wen
   18 Aug 2020