Buffer Overflow in Mixture of Experts
CoRR (2024)
Abstract
Mixture of Experts (MoE) has become a key ingredient for scaling large
foundation models while keeping inference costs steady. We show that expert
routing strategies that have cross-batch dependencies are vulnerable to
attacks. Malicious queries can be sent to a model and can affect a model's
output on other benign queries if they are grouped in the same batch. We
demonstrate this via a proof-of-concept attack in a toy experimental setting.
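The cross-batch dependency arises because experts typically have a fixed buffer capacity: once an expert's buffer is full, later tokens routed to it are dropped or rerouted, so a benign query's routing depends on the other tokens sharing its batch. Below is a minimal sketch of this effect, assuming top-1 routing with a fixed per-expert capacity and dropped overflow tokens; the function, logits, and batch contents are illustrative and not the paper's actual attack or model.

```python
import numpy as np

def route_with_capacity(token_logits, capacity):
    """Greedy top-1 routing with a fixed per-expert buffer capacity (sketch).

    token_logits: (num_tokens, num_experts) router scores for one batch.
    capacity: max tokens each expert may accept; overflow tokens are dropped.
    Returns the expert index assigned to each token, or -1 if dropped.
    """
    num_tokens, num_experts = token_logits.shape
    load = np.zeros(num_experts, dtype=int)          # tokens accepted per expert so far
    assignment = np.full(num_tokens, -1, dtype=int)  # -1 means "dropped"
    for t in range(num_tokens):                      # order-dependent: earlier tokens fill buffers first
        expert = int(np.argmax(token_logits[t]))     # each token's preferred expert
        if load[expert] < capacity:
            assignment[t] = expert
            load[expert] += 1
    return assignment

# Hypothetical batch: 2 experts, capacity 1 per expert.
rng = np.random.default_rng(0)
benign = rng.normal(size=(1, 2))             # the victim's single query token
attacker = np.tile([5.0, -5.0], (3, 1))      # adversarial tokens that all prefer expert 0

alone = route_with_capacity(benign, capacity=1)
mixed = route_with_capacity(np.vstack([attacker, benign]), capacity=1)
print("benign token alone:", alone[0])           # routed to its preferred expert
print("benign token batched with attacker:", mixed[-1])  # may be dropped, changing the output
```

Running the sketch shows that the same benign token is served normally when batched alone but can lose its expert slot when co-batched with adversarial tokens that saturate that expert's buffer, which is the dependency the attack exploits.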