Teach Me Sign: Stepwise Prompting LLM for Sign Language Production

IEEE International Conference on Image Processing (ICIP), 2025
Zhaoyi An
Rei Kawakami
Main: 5 pages · 3 figures · 2 tables · Bibliography: 1 page
Abstract

Large language models, with their strong reasoning ability and rich knowledge, have revolutionized many AI tasks, but their impact on sign language generation remains limited due to the complexity and unique rules of sign language. In this paper, we propose TEAch Me Sign (TEAM-Sign), which treats sign language as another natural language. By fine-tuning an LLM, we enable it to learn the correspondence between text and sign language and thereby facilitate generation. To account for the differences between sign and spoken language, we employ a stepwise prompting strategy that extracts the sign language knowledge inherent in the LLM, supporting both learning and generation. Experimental results on the How2Sign and Phoenix14T datasets demonstrate that our approach effectively leverages the LLM's sign language knowledge and reasoning capabilities to align the differing distributions and grammatical rules of sign and spoken language.
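The stepwise prompting strategy described above can be illustrated with a minimal sketch. Everything here is an assumption for illustration: the prompt wording, the two-step decomposition (spoken text → gloss order → sign tokens), and the function names are hypothetical, not the paper's actual templates or API.

```python
# Illustrative sketch of stepwise prompting for sign language production.
# The decomposition (text -> gloss sequence -> sign tokens) and all prompt
# templates are assumptions for exposition, not the paper's exact method.

def build_gloss_prompt(text: str) -> str:
    """Step 1: ask the LLM to reorder spoken-language text into a
    sign-language gloss sequence (sign-order grammar)."""
    return (
        "Rewrite the following sentence as a sign language gloss "
        f"sequence, following sign-language word order:\n{text}"
    )

def build_sign_prompt(gloss: str) -> str:
    """Step 2: ask the LLM to map the gloss sequence to sign tokens."""
    return (
        "Translate the following gloss sequence into sign tokens:\n"
        f"{gloss}"
    )

def stepwise_generate(text: str, llm) -> str:
    """Chain the two prompts: text -> gloss -> sign tokens."""
    gloss = llm(build_gloss_prompt(text))
    return llm(build_sign_prompt(gloss))

def echo_llm(prompt: str) -> str:
    """Stub LLM so the sketch runs without a model: echoes the
    input line (the text after the instruction) in uppercase."""
    return prompt.splitlines()[-1].upper()

print(stepwise_generate("I am learning sign language", echo_llm))
```

The point of the decomposition is that each prompt asks for one transformation the model can ground in its existing linguistic knowledge, rather than asking for sign output in a single step.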
