Pre-train or Annotate? Domain Adaptation with a Constrained Budget

10 September 2021
Fan Bai, Alan Ritter, Wei Xu

Papers citing "Pre-train or Annotate? Domain Adaptation with a Constrained Budget" (5 papers shown)

  1. Efficient Domain-adaptive Continual Pretraining for the Process Industry in the German Language
     Anastasia Zhukova, Christian E. Matt, Terry Ruas, Bela Gipp (CLL, VLM)
     28 Apr 2025

  2. LUK: Empowering Log Understanding with Expert Knowledge from Large Language Models
     Lipeng Ma, Weidong Yang, Sihang Jiang, Ben Fei, Mingjie Zhou, Shuhao Li, Bo Xu, Yanghua Xiao
     03 Sep 2024

  3. Carbon Emissions and Large Neural Network Training
     David A. Patterson, Joseph E. Gonzalez, Quoc V. Le, Chen Liang, Lluís-Miquel Munguía, D. Rothchild, David R. So, Maud Texier, J. Dean (AI4CE)
     21 Apr 2021

  4. A Frustratingly Easy Approach for Entity and Relation Extraction
     Zexuan Zhong, Danqi Chen
     24 Oct 2020

  5. Knowledge Enhanced Contextual Word Representations
     Matthew E. Peters, Mark Neumann, Robert L. Logan IV, Roy Schwartz, Vidur Joshi, Sameer Singh, Noah A. Smith
     09 Sep 2019