
- Keep Decoding Parallel with Effective Knowledge Distillation from Language Models to End-to-end Speech Recognisers. IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), 2024.
- Confidence Preservation Property in Knowledge Distillation Abstractions. SGAI International Conference on Artificial Intelligence (SGAI), 2024.
- Location Aware Modular Biencoder for Tourism Question Answering. International Joint Conference on Natural Language Processing (IJCNLP), 2024.
- Building Variable-sized Models via Learngene Pool. AAAI Conference on Artificial Intelligence (AAAI), 2023.
- Transformer as Linear Expansion of Learngene. AAAI Conference on Artificial Intelligence (AAAI), 2023.
- PEA-Diffusion: Parameter-Efficient Adapter with Knowledge Distillation in non-English Text-to-Image Generation. European Conference on Computer Vision (ECCV), 2024.
- Knowledge Distillation Based Semantic Communications for Multiple Users. IEEE Transactions on Wireless Communications (IEEE TWC), 2023.