The generation of complex derived word forms has been an overlooked problem in NLP; we fill this gap by applying neural sequence-to-sequence models to the task. We overview the theoretical motivation for a paradigmatic treatment of derivational morphology, and introduce the task of derivational paradigm completion as a parallel to inflectional paradigm completion. State-of-the-art neural models, adapted from the inflection task, are able to learn a range of derivation patterns and outperform a non-neural baseline by 16.4%. However, due to the semantic, historical, and lexical considerations involved in derivational morphology, future work will be needed to achieve performance parity with inflection-generating systems.
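Since the abstract frames derivational paradigm completion as a parallel to inflectional paradigm completion, a toy sketch may help make the task format concrete. The sketch below (in PyTorch) trains a single-layer GRU encoder-decoder to map a base lemma plus a derivational tag to the derived form, treated as character-level transduction. It is not the paper's system: the tag names (AGENT, NOMINAL), the four toy training pairs, and the attention-free architecture are illustrative assumptions, standing in for the state-of-the-art inflection models the paper adapts.

```python
# Toy sketch: derivational paradigm completion as character-level
# sequence-to-sequence transduction. Source = derivational tag + lemma
# characters; target = characters of the derived form.
import torch
import torch.nn as nn

# Illustrative training pairs: (lemma, derivational tag, derived form).
PAIRS = [
    ("bake",  "AGENT",   "baker"),
    ("teach", "AGENT",   "teacher"),
    ("happy", "NOMINAL", "happiness"),
    ("sad",   "NOMINAL", "sadness"),
]

TAGS = sorted({t for _, t, _ in PAIRS})
CHARS = sorted({c for l, _, d in PAIRS for c in l + d})
# Vocabulary: padding, BOS, EOS, one symbol per tag, then characters.
ITOS = ["<pad>", "<s>", "</s>"] + [f"<{t}>" for t in TAGS] + CHARS
STOI = {s: i for i, s in enumerate(ITOS)}

def encode_src(lemma, tag):
    return [STOI[f"<{tag}>"]] + [STOI[c] for c in lemma] + [STOI["</s>"]]

def encode_tgt(form):
    return [STOI["<s>"]] + [STOI[c] for c in form] + [STOI["</s>"]]

class Seq2Seq(nn.Module):
    def __init__(self, vocab, dim=64):
        super().__init__()
        self.emb = nn.Embedding(vocab, dim, padding_idx=0)
        self.enc = nn.GRU(dim, dim, batch_first=True)
        self.dec = nn.GRU(dim, dim, batch_first=True)
        self.out = nn.Linear(dim, vocab)

    def forward(self, src, tgt_in):
        _, h = self.enc(self.emb(src))        # encode tag + lemma
        y, _ = self.dec(self.emb(tgt_in), h)  # decoder starts from encoder state
        return self.out(y)

model = Seq2Seq(len(ITOS))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss(ignore_index=0)

for epoch in range(300):
    for lemma, tag, form in PAIRS:
        src = torch.tensor([encode_src(lemma, tag)])
        tgt = torch.tensor([encode_tgt(form)])
        logits = model(src, tgt[:, :-1])      # teacher forcing
        loss = loss_fn(logits.reshape(-1, len(ITOS)), tgt[:, 1:].reshape(-1))
        opt.zero_grad()
        loss.backward()
        opt.step()

def generate(lemma, tag, max_len=20):
    """Greedy character-by-character decoding of the derived form."""
    src = torch.tensor([encode_src(lemma, tag)])
    _, h = model.enc(model.emb(src))
    tok = torch.tensor([[STOI["<s>"]]])
    out = []
    for _ in range(max_len):
        y, h = model.dec(model.emb(tok), h)
        tok = model.out(y).argmax(-1)
        sym = ITOS[tok.item()]
        if sym == "</s>":
            break
        out.append(sym)
    return "".join(out)

print(generate("bake", "AGENT"))  # ideally "baker" after training
```

Framing both inflection and derivation this way lets one transducer handle either task; the abstract's caveat is that derivation is harder, since the correct output also depends on semantic, historical, and lexical factors that the character sequence alone does not encode.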
@article{cotterell2017_1708.09151,
  title={Paradigm Completion for Derivational Morphology},
  author={Ryan Cotterell and Ekaterina Vylomova and Huda Khayrallah and Christo Kirov and David Yarowsky},
  journal={arXiv preprint arXiv:1708.09151},
  year={2017}
}