AnchiBERT: A Pre-Trained Model for Ancient Chinese Language Understanding and Generation

24 September 2020
Huishuang Tian, Kexin Yang, Dayiheng Liu, Jiancheng Lv

Papers citing "AnchiBERT: A Pre-Trained Model for Ancient Chinese Language Understanding and Generation"

4 / 4 papers shown
WenyanGPT: A Large Language Model for Classical Chinese Tasks
Xinyu Yao, Mengdi Wang, Bo Chen, Xiaobing Zhao
29 Apr 2025

LLM-based multi-agent poetry generation in non-cooperative environments
Ran Zhang, Steffen Eger
05 Sep 2024

SikuGPT: A Generative Pre-trained Model for Intelligent Information Processing of Ancient Texts from the Perspective of Digital Humanities
Chang Liu, Dongbo Wang, Zhixiao Zhao, Die Hu, Mengcheng Wu, ..., Si Shen, Bin Li, Jiangfeng Liu, Hai Zhang, Lianzheng Zhao
16 Apr 2023

Pre-Training BERT on Arabic Tweets: Practical Considerations
Ahmed Abdelali, Sabit Hassan, Hamdy Mubarak, Kareem Darwish, Younes Samih
21 Feb 2021