Neural Lower Bounds for Verification

2023 IEEE Conference on Secure and Trustworthy Machine Learning (SaTML), 2023

Abstract
Recent years have witnessed the deployment of branch-and-bound (BaB) frameworks for formal verification in deep learning, that is, for proving or disproving a desirable property of a neural network. The main computational bottleneck of BaB is the estimation of lower bounds via convex relaxations. Past work in this field has relied on traditional optimization algorithms whose inefficiencies have limited their scope. To alleviate this deficiency, we propose a novel graph neural network (GNN) based approach. Our GNN architecture closely resembles the network we wish to verify. During inference, it performs forward-backward passes through the GNN layers to compute a feasible dual solution of the convex relaxation, thereby providing a valid lower bound. During training, its parameters are estimated via a loss function that encourages large lower bounds over a time horizon. Using standard publicly available datasets, we show that our approach provides a significant speedup for formal verification compared to state-of-the-art solvers. Moreover, the GNN achieves good generalization performance on unseen networks.
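The abstract's key observation is that any feasible dual solution of the convex relaxation certifies a valid lower bound via weak duality, so a learned (rather than optimized) dual candidate is still sound. A minimal sketch of that principle on a box-constrained LP is given below; the function name and the toy LP instance are illustrative assumptions, not the paper's actual relaxation or architecture.

```python
import numpy as np

def lagrangian_lower_bound(c, A, b, l, u, lam):
    """Weak duality: for the LP  min c^T x  s.t.  A x >= b,  l <= x <= u,
    any dual vector lam >= 0 yields a valid lower bound
        g(lam) = lam^T b + min_{l <= x <= u} (c - A^T lam)^T x,
    where the inner minimum decouples coordinate-wise over the box.
    Illustrative sketch only: the paper predicts the dual variables with a
    GNN instead of running an iterative optimizer, but soundness comes from
    exactly this kind of feasibility argument."""
    r = c - A.T @ lam                        # reduced costs
    inner = np.where(r >= 0, r * l, r * u)   # per-coordinate box minimum
    return lam @ b + inner.sum()

# Tiny example: min x0 + x1  s.t.  x0 + x1 >= 1,  0 <= x <= 1  (optimum = 1).
c = np.array([1.0, 1.0])
A = np.array([[1.0, 1.0]])
b = np.array([1.0])
l, u = np.zeros(2), np.ones(2)
lb_loose = lagrangian_lower_bound(c, A, b, l, u, np.array([0.0]))  # 0.0
lb_tight = lagrangian_lower_bound(c, A, b, l, u, np.array([1.0]))  # 1.0
```

Any nonnegative `lam` gives a bound no larger than the true optimum; a better dual candidate (here `lam = 1`) simply gives a tighter, still-valid bound, which is why training can safely push the GNN toward large lower bounds.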
Keywords
Formal Neural Network Verification, Adversarial Robustness, Graph Neural Networks, Optimization