
| Title | Venue |
|---|---|
| Sneaking Syntax into Transformer Language Models with Tree Regularization | North American Chapter of the Association for Computational Linguistics (NAACL), 2024 |
| On Eliciting Syntax from Language Models via Hashing | Conference on Empirical Methods in Natural Language Processing (EMNLP), 2024 |
| Structured Attention Networks | International Conference on Learning Representations (ICLR), 2017 |