arXiv:2510.18434

Chain-of-Conceptual-Thought Elicits Daily Conversation in Large Language Models

21 October 2025
Qingqing Gu
Dan Wang
Yue Zhao
Xiaoyu Wang
Zhonglin Jiang
Yong Chen
Hongyan Li
Luo Ji
Main: 8 pages, 4 figures, 16 tables; Bibliography: 2 pages; Appendix: 5 pages
Abstract

Chain-of-Thought (CoT) is widely applied to enhance LLM capabilities on math, coding, and reasoning tasks. However, its performance is limited on open-domain tasks, where there are no clearly defined reasoning steps or logical transitions. To mitigate this, we propose a new prompt-based paradigm called Chain of Conceptual Thoughts (CoCT), which instructs the LLM to first produce a concept tag and then complete the detailed content following that concept. To encourage this hierarchical way of thinking, we instantiate the concepts as emotions, strategies, and topics. We experiment with this paradigm on daily and emotional support conversations, covering tasks with both in-domain and out-of-domain concept settings. Automatic, human, and LLM-based evaluations show that CoCT surpasses several prompt-based baselines such as Self-Refine, ECoT, SoT, and RAG, suggesting a promising LLM prompting paradigm for a wider scope of tasks.
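The abstract describes CoCT as a two-stage prompt: the model first emits a concept tag (an emotion, strategy, or topic), then writes the reply conditioned on that tag. The sketch below illustrates how such a paradigm might be wired up, assuming an OpenAI-compatible chat client; the concept inventories, prompt wording, tag format, and model name are illustrative assumptions, not the paper's actual prompts.

```python
import re
from openai import OpenAI  # any chat-completion client with the same interface would work

# Hypothetical concept inventory; the paper's exact tag sets are not reproduced here.
CONCEPTS = {
    "emotion": ["joy", "sadness", "anger", "surprise", "neutral"],
    "strategy": ["question", "self-disclosure", "reflection", "suggestion"],
    "topic": ["work", "family", "health", "hobbies"],
}

# CoCT-style system prompt: ask for a concept tag first, then the utterance.
COCT_SYSTEM_PROMPT = (
    "You are a daily-conversation assistant. Before replying, output a concept tag "
    "of the form [type: value], where type is emotion, strategy, or topic and value "
    "comes from the allowed lists below. Then write a reply that follows that concept.\n"
    + "\n".join(f"{k}: {', '.join(v)}" for k, v in CONCEPTS.items())
)

def coct_reply(client: OpenAI, user_turn: str, model: str = "gpt-4o-mini") -> dict:
    """Elicit a concept tag followed by the detailed reply (CoCT-style prompting)."""
    resp = client.chat.completions.create(
        model=model,
        messages=[
            {"role": "system", "content": COCT_SYSTEM_PROMPT},
            {"role": "user", "content": user_turn},
        ],
    )
    text = resp.choices[0].message.content
    # Split the leading "[type: value]" tag from the reply body.
    match = re.match(r"\s*\[(\w+):\s*([^\]]+)\]\s*(.*)", text, re.DOTALL)
    if match:
        return {"concept_type": match.group(1), "concept": match.group(2).strip(),
                "reply": match.group(3).strip()}
    return {"concept_type": None, "concept": None, "reply": text}

if __name__ == "__main__":
    client = OpenAI()  # requires OPENAI_API_KEY in the environment
    print(coct_reply(client, "I finally finished my big project at work today!"))
```

In this reading, the concept tag plays the role that intermediate reasoning steps play in standard CoT: it gives the model an explicit high-level decision (what to convey and how) before it commits to surface wording.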
