ResearchTrend.AI


TutorLLM: Customizing Learning Recommendations with Knowledge Tracing and Retrieval-Augmented Generation

20 January 2025
Zhaoxing Li
V. Yazdanpanah
Jindi Wang
Wen Gu
Lei Shi
Alexandra I. Cristea
Sarah Kiden
Sebastian Stein
    AI4Ed
    RALM
Abstract

The integration of AI in education offers significant potential to enhance learning efficiency. Large Language Models (LLMs), such as ChatGPT, Gemini, and Llama, allow students to query a wide range of topics, providing unprecedented flexibility. However, LLMs face challenges such as varying content relevance and a lack of personalization. To address these challenges, we propose TutorLLM, a personalized learning recommender LLM system based on Knowledge Tracing (KT) and Retrieval-Augmented Generation (RAG). The novelty of TutorLLM lies in its unique combination of KT and RAG techniques with LLMs, which enables dynamic retrieval of context-specific knowledge and provides personalized learning recommendations based on the student's individual learning state. Specifically, this integration allows TutorLLM to tailor responses to learning states predicted by the Multi-Features with Latent Relations BERT-based KT (MLFBK) model and to enhance response accuracy with a Scraper model. The evaluation includes user assessment questionnaires and performance metrics, demonstrating a 10% improvement in user satisfaction and a 5% increase in quiz scores compared to using general LLMs alone.
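The pipeline described in the abstract — a KT model estimating the student's learning state, a retriever selecting relevant material, and both feeding a personalized LLM prompt — can be sketched as follows. This is a minimal illustrative toy, not the paper's implementation: the fraction-correct mastery estimate stands in for the MLFBK model, the word-overlap retriever stands in for the RAG component, and all class and function names are hypothetical.

```python
# Hedged sketch of a TutorLLM-style pipeline: knowledge tracing -> retrieval
# -> personalized prompt. All names here are illustrative, not the paper's API.
from dataclasses import dataclass

@dataclass
class LearningState:
    mastery: dict[str, float]  # concept -> estimated mastery in [0, 1]

def trace_knowledge(history: list[tuple[str, bool]]) -> LearningState:
    """Toy stand-in for the MLFBK KT model: mastery = fraction answered correctly."""
    totals: dict[str, list[int]] = {}
    for concept, correct in history:
        stats = totals.setdefault(concept, [0, 0])
        stats[0] += int(correct)
        stats[1] += 1
    return LearningState({c: s[0] / s[1] for c, s in totals.items()})

def retrieve(query: str, corpus: dict[str, str], k: int = 2) -> list[str]:
    """Toy retriever: rank documents by word overlap with the query."""
    q = set(query.lower().split())
    ranked = sorted(corpus,
                    key=lambda d: len(q & set(corpus[d].lower().split())),
                    reverse=True)
    return ranked[:k]

def build_prompt(query: str, state: LearningState, corpus: dict[str, str]) -> str:
    """Combine weak concepts from the KT state with retrieved context."""
    weak = [c for c, m in state.mastery.items() if m < 0.5]
    context = "\n".join(corpus[d] for d in retrieve(query, corpus))
    return (f"Student question: {query}\n"
            f"Weak concepts: {', '.join(weak) or 'none'}\n"
            f"Retrieved material:\n{context}\n"
            f"Answer with a personalized learning recommendation.")
```

In the actual system the mastery estimates would come from the trained MLFBK model and the retrieved material from the Scraper-backed RAG store; only the composition pattern is shown here.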

@article{li2025_2502.15709,
  title={TutorLLM: Customizing Learning Recommendations with Knowledge Tracing and Retrieval-Augmented Generation},
  author={Zhaoxing Li and Vahid Yazdanpanah and Jindi Wang and Wen Gu and Lei Shi and Alexandra I. Cristea and Sarah Kiden and Sebastian Stein},
  journal={arXiv preprint arXiv:2502.15709},
  year={2025}
}