The Self-Learning Agent with a Progressive Neural Network Integrated Transformer

Abstract
This paper introduces a self-learning agent that integrates LLaMA 3.2 with a Progressive Neural Network (PNN) for continual learning in conversational AI and code generation. The framework dynamically collects data, fine-tunes on new tasks with minimal samples, and leverages meta-learning for rapid adaptation. LoRA makes fine-tuning efficient, while Elastic Weight Consolidation (EWC) improves knowledge retention. Experimental results demonstrate improved adaptability and memory stability, positioning this approach as a scalable step toward Artificial General Intelligence (AGI).
Citation:

@article{sivakumar2025_2504.02489,
  title   = {The Self-Learning Agent with a Progressive Neural Network Integrated Transformer},
  author  = {Ajay Sivakumar and Shalini and Vasantha Raj and Sebastian Sylvester},
  journal = {arXiv preprint arXiv:2504.02489},
  year    = {2025}
}