
A Dual-Directional Context-Aware Test-Time Learning for Text Classification

International Conference on Intelligent Computing (ICIC), 2025
Main: 10 pages, 1 figure, 5 tables
Abstract

Text classification assigns text to predefined categories. Traditional methods struggle with complex structures and long-range dependencies. Deep learning with recurrent neural networks and Transformer models has improved feature extraction and context awareness, but these models still trade off interpretability, efficiency, and contextual range. We propose the Dynamic Bidirectional Elman Attention Network (DBEAN), which combines bidirectional temporal modeling with self-attention. It dynamically weights critical input segments while preserving computational efficiency.
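To make the high-level idea concrete, the following is a minimal NumPy sketch of the abstract's description, not the paper's actual implementation: a bidirectional Elman recurrence encodes the sequence in both directions, and a simple attention layer weights timesteps before classification. All names, shapes, and the single-vector scoring scheme are illustrative assumptions.

```python
import numpy as np

def elman_pass(x, Wx, Wh, b):
    # Elman recurrence: h_t = tanh(x_t Wx + h_{t-1} Wh + b)
    T = x.shape[0]
    H = Wh.shape[0]
    h = np.zeros(H)
    out = np.zeros((T, H))
    for t in range(T):
        h = np.tanh(x[t] @ Wx + h @ Wh + b)
        out[t] = h
    return out

def dbean_sketch(x, params):
    # Bidirectional encoding: run forward and reversed passes, concatenate
    fwd = elman_pass(x, *params["fwd"])
    bwd = elman_pass(x[::-1], *params["bwd"])[::-1]
    h = np.concatenate([fwd, bwd], axis=1)        # (T, 2H)
    # Attention pooling: score each timestep, softmax-normalize,
    # then take a weighted sum so salient segments dominate the context
    scores = h @ params["v"]                      # (T,)
    w = np.exp(scores - scores.max())
    w /= w.sum()
    ctx = w @ h                                   # (2H,)
    return ctx @ params["Wc"] + params["bc"]      # class logits

rng = np.random.default_rng(0)
T, D, H, C = 6, 4, 3, 2  # toy sizes: sequence length, input dim, hidden, classes
params = {
    "fwd": (rng.standard_normal((D, H)), rng.standard_normal((H, H)), np.zeros(H)),
    "bwd": (rng.standard_normal((D, H)), rng.standard_normal((H, H)), np.zeros(H)),
    "v": rng.standard_normal(2 * H),
    "Wc": rng.standard_normal((2 * H, C)),
    "bc": np.zeros(C),
}
logits = dbean_sketch(rng.standard_normal((T, D)), params)
print(logits.shape)
```

The single learned scoring vector `v` stands in for whatever attention mechanism the paper actually uses; the point of the sketch is only the data flow: bidirectional recurrence, attention weighting, then a linear classifier.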
