Maximum Score Routing for Mixture-of-Experts. Annual Meeting of the Association for Computational Linguistics (ACL), 2025.
DIVE into MoE: Diversity-Enhanced Reconstruction of Large Language Models from Dense into Mixture-of-Experts. Annual Meeting of the Association for Computational Linguistics (ACL), 2025.
ABC-FHE: A Resource-Efficient Accelerator Enabling Bootstrappable Parameters for Client-Side Fully Homomorphic Encryption. Design Automation Conference (DAC), 2025.