2310.16240
Cited By
Mixture-of-Linguistic-Experts Adapters for Improving and Interpreting Pre-trained Language Models
24 October 2023 · Raymond Li, Gabriel Murray, Giuseppe Carenini · MoE
Papers citing "Mixture-of-Linguistic-Experts Adapters for Improving and Interpreting Pre-trained Language Models" (4 / 4 papers shown)
Title | Authors | Tag | Metrics | Date
Efficiently Democratizing Medical LLMs for 50 Languages via a Mixture of Language Family Experts | Guorui Zheng, Xidong Wang, Juhao Liang, Nuo Chen, Yuping Zheng, Benyou Wang | MoE | 30 / 5 / 0 | 14 Oct 2024
The Power of Scale for Parameter-Efficient Prompt Tuning | Brian Lester, Rami Al-Rfou, Noah Constant | VPVLM | 280 / 3,843 / 0 | 18 Apr 2021
Stanza: A Python Natural Language Processing Toolkit for Many Human Languages | Peng Qi, Yuhao Zhang, Yuhui Zhang, Jason Bolton, Christopher D. Manning | AI4TS | 199 / 1,652 / 0 | 16 Mar 2020
GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding | Alex Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman | ELM | 294 / 6,943 / 0 | 20 Apr 2018