Steering MoE LLMs via Expert (De)Activation

11 September 2025
Mohsen Fayyaz, Ali Modarressi, Hanieh Deilamsalehy, Franck Dernoncourt, Ryan Rossi, Trung Bui, Hinrich Schütze, Nanyun Peng
MoE · LLMSV
arXiv: 2509.09660 (abs · PDF · HTML) · GitHub (13★)

Papers citing "Steering MoE LLMs via Expert (De)Activation" (3 of 3 papers shown)

Multilingual Routing in Mixture-of-Experts
Lucas Bandarkar, Chenyuan Yang, Mohsen Fayyaz, Junlin Hu, Nanyun Peng
MoE · 06 Oct 2025

ASGuard: Activation-Scaling Guard to Mitigate Targeted Jailbreaking Attack
Yein Park, Jungwoo Park, Jaewoo Kang
30 Sep 2025

Defending MoE LLMs against Harmful Fine-Tuning via Safety Routing Alignment
Jaehan Kim, Minkyoo Song, Seungwon Shin, Sooel Son
MoE · 26 Sep 2025