ResearchTrend.AI
GaVaMoE: Gaussian-Variational Gated Mixture of Experts for Explainable Recommendation

15 October 2024
Fei Tang
Yongliang Shen
Hang Zhang
Zeqi Tan
Wenqi Zhang
Guiyang Hou
Kaitao Song
Weiming Lu
Yueting Zhuang
Abstract

Large language model-based explainable recommendation (LLM-based ER) systems show promise in generating human-like explanations for recommendations. However, they face challenges in modeling user-item collaborative preferences, personalizing explanations, and handling sparse user-item interactions. To address these issues, we propose GaVaMoE, a novel Gaussian-Variational Gated Mixture of Experts framework for explainable recommendation. GaVaMoE introduces two key components: (1) a rating reconstruction module that employs a Variational Autoencoder (VAE) with a Gaussian Mixture Model (GMM) to capture complex user-item collaborative preferences, serving as a pre-trained multi-gating mechanism; and (2) a set of fine-grained expert models coupled with the multi-gating mechanism for generating highly personalized explanations. The VAE component models latent factors in user-item interactions, while the GMM clusters users with similar behaviors. Each cluster corresponds to a gate in the multi-gating mechanism, routing user-item pairs to the appropriate expert models. This architecture enables GaVaMoE to generate tailored explanations for specific user types and preferences, and it mitigates data sparsity by leveraging user similarities. Extensive experiments on three real-world datasets demonstrate that GaVaMoE significantly outperforms existing methods in explanation quality, personalization, and consistency. Notably, GaVaMoE exhibits robust performance in scenarios with sparse user-item interactions, maintaining high-quality explanations even for users with limited historical data.
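To make the gating idea concrete: a minimal, self-contained sketch of GMM-based routing in the spirit of the abstract. In GaVaMoE the gating distribution comes from a pre-trained VAE+GMM over user-item latents; here the Gaussian components are hand-set toy values, and all names (`gaussian_pdf`, `responsibilities`, `route_to_expert`, the `experts` list) are illustrative, not the authors' implementation.

```python
import math

def gaussian_pdf(x, mean, var):
    """Density of a 1-D Gaussian at x (stand-in for a diagonal-covariance component)."""
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def responsibilities(z, weights, means, variances):
    """Posterior cluster probabilities p(k | z) for a scalar latent z."""
    joint = [w * gaussian_pdf(z, m, v) for w, m, v in zip(weights, means, variances)]
    total = sum(joint)
    return [j / total for j in joint]

def route_to_expert(z, weights, means, variances, experts):
    """Gate: send the user-item latent to the expert of its most probable cluster."""
    r = responsibilities(z, weights, means, variances)
    k = max(range(len(r)), key=lambda i: r[i])
    return experts[k](z), k

# Two toy "experts"; in the paper each would be a fine-grained explanation generator.
experts = [
    lambda z: f"expert-0 explanation for latent {z:.2f}",
    lambda z: f"expert-1 explanation for latent {z:.2f}",
]
weights, means, variances = [0.5, 0.5], [-1.0, 1.0], [0.5, 0.5]

text, cluster = route_to_expert(0.8, weights, means, variances, experts)
# A latent near +1 has higher responsibility under component 1,
# so the gate routes this user-item pair to expert 1.
```

Because routing is driven by cluster membership rather than per-user statistics, a user with few interactions still inherits the expert of similar users, which is the mechanism the abstract credits for robustness under sparsity.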

@article{tang2025_2410.11841,
  title={GaVaMoE: Gaussian-Variational Gated Mixture of Experts for Explainable Recommendation},
  author={Fei Tang and Yongliang Shen and Hang Zhang and Zeqi Tan and Wenqi Zhang and Zhibiao Huang and Kaitao Song and Weiming Lu and Yueting Zhuang},
  journal={arXiv preprint arXiv:2410.11841},
  year={2025}
}