
| Title |
|---|
| Towards a Unified View of Parameter-Efficient Transfer Learning. International Conference on Learning Representations (ICLR), 2022 |
| DEMix Layers: Disentangling Domains for Modular Language Modeling. North American Chapter of the Association for Computational Linguistics (NAACL), 2022 |
| LoRA: Low-Rank Adaptation of Large Language Models. International Conference on Learning Representations (ICLR), 2022 |
| Compacter: Efficient Low-Rank Hypercomplex Adapter Layers. Neural Information Processing Systems (NeurIPS), 2021 |
| Fast, Effective, and Self-Supervised: Transforming Masked Language Models into Universal Lexical and Sentence Encoders. Conference on Empirical Methods in Natural Language Processing (EMNLP), 2021 |
| Mind the Gap: Assessing Temporal Generalization in Neural Language Models. Neural Information Processing Systems (NeurIPS), 2021 |
| AdapterFusion: Non-Destructive Task Composition for Transfer Learning. Conference of the European Chapter of the Association for Computational Linguistics (EACL), 2021 |