Adapt-and-Distill: Developing Small, Fast and Effective Pretrained Language Models for Domains
arXiv: 2106.13474
25 June 2021
Yunzhi Yao, Shaohan Huang, Wenhui Wang, Li Dong, Furu Wei
Topics: VLM, ALM

Papers citing "Adapt-and-Distill: Developing Small, Fast and Effective Pretrained Language Models for Domains" (6 papers)

Efficient Domain-adaptive Continual Pretraining for the Process Industry in the German Language
Anastasia Zhukova, Christian E. Matt, Terry Ruas, Bela Gipp
Topics: CLL, VLM
28 Apr 2025

The Rise of Small Language Models in Healthcare: A Comprehensive Survey
Muskan Garg, Shaina Raza, Shebuti Rayana, Xingyi Liu, Sunghwan Sohn
Topics: LM&MA, AILaw
23 Apr 2025

SaulLM-7B: A pioneering Large Language Model for Law
Pierre Colombo, T. Pires, Malik Boudiaf, Dominic Culver, Rui Melo, ..., Andre F. T. Martins, Fabrizio Esposito, Vera Lúcia Raposo, Sofia Morgado, Michael Desa
Topics: ELM, AILaw
06 Mar 2024

AdaSent: Efficient Domain-Adapted Sentence Embeddings for Few-Shot Classification
Yongxin Huang, Kexin Wang, Sourav Dutta, Raj Nath Patel, Goran Glavas, Iryna Gurevych
Topics: VLM
01 Nov 2023

DREEAM: Guiding Attention with Evidence for Improving Document-Level Relation Extraction
Youmi Ma, An Wang, Naoaki Okazaki
17 Feb 2023

Unsupervised Domain Adaptation for Sparse Retrieval by Filling Vocabulary and Word Frequency Gaps
Hiroki Iida, Naoaki Okazaki
08 Nov 2022