ResearchTrend.AI
Pretraining and Updating Language- and Domain-specific Large Language Model: A Case Study in Japanese Business Domain

12 April 2024
Kosuke Takahashi, Takahiro Omi, Kosuke Arima, Tatsuya Ishigaki

Papers citing "Pretraining and Updating Language- and Domain-specific Large Language Model: A Case Study in Japanese Business Domain"

2 papers shown

1. Fine-tuned Language Models are Continual Learners
   Thomas Scialom, Tuhin Chakrabarty, Smaranda Muresan
   Topics: CLL, LRM
   24 May 2022
2. The Pile: An 800GB Dataset of Diverse Text for Language Modeling
   Leo Gao, Stella Biderman, Sid Black, Laurence Golding, Travis Hoppe, ..., Horace He, Anish Thite, Noa Nabeshima, Shawn Presser, Connor Leahy
   Topic: AIMat
   31 Dec 2020