Designing Neural Networks Using Logical Specs

2020 IEEE 31st International Symposium on Software Reliability Engineering (ISSRE), 2020

Abstract
As systems that deploy machine learning models become increasingly pervasive, there is an urgent need to ensure their reliability and safety. While recent years have seen considerable progress in techniques for verification and validation of machine learning models, reasoning about and explaining their behaviors remains challenging. This paper introduces a new approach for creating machine learning models where, instead of traditional supervised learning from data, the models are directly synthesized from specifications and are thus correct by construction. Our focus is binary classifiers with boolean features. Specifically, our approach translates relational specifications written in the well-known modeling language Alloy to neural networks that run on the widely used TensorFlow backend. Our key insight is that a slight enhancement of traditional boolean gates provides a rich intermediate representation that readily translates to neural networks. To translate the enhanced gates to neural networks, we employ a state-of-the-art program synthesis framework that allows us to find minimal neural networks. The translation of Alloy specifications then follows the standard translation to boolean logic, except that it utilizes the enhanced boolean gates, followed by a translation to neural networks. We embody our approach in a prototype tool and use it for experimental evaluation. The experimental results show that our approach allows synthesis of neural networks that are hard to create using traditional training methods.
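The core insight above, that boolean gates map naturally onto neural-network units, can be illustrated with a minimal sketch. This is not the paper's tool or its enhanced-gate representation; it is only the classical observation the approach builds on, namely that a single perceptron-style neuron with a hard threshold realizes AND, OR, and NOT, while a function like XOR requires composing gates (which is why finding minimal networks calls for a synthesis framework):

```python
# Illustrative sketch only: a single hard-threshold neuron can realize a
# boolean gate, the kind of unit that a gate-level intermediate
# representation translates into when building a neural network.

def neuron(weights, bias, inputs):
    """Perceptron-style unit: weighted sum plus bias, step activation."""
    s = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1 if s > 0 else 0

def AND(a, b):
    # Fires only when both boolean inputs are 1 (sum must exceed 1.5).
    return neuron([1.0, 1.0], -1.5, [a, b])

def OR(a, b):
    # Fires when at least one input is 1 (sum must exceed 0.5).
    return neuron([1.0, 1.0], -0.5, [a, b])

def NOT(a):
    # Negative weight inverts the input.
    return neuron([-1.0], 0.5, [a])

def XOR(a, b):
    # Not linearly separable: needs a composition of gates, i.e. a
    # multi-layer network -- one reason minimizing the synthesized
    # network is a non-trivial search problem.
    return AND(OR(a, b), NOT(AND(a, b)))
```

A boolean formula translated from an Alloy specification can, in this style, be lowered gate by gate into neuron weights and biases, after which the whole circuit is a (dense, threshold-activated) neural network.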
Keywords
Relational properties, Program Synthesis, Neural Networks, First order logic, Alloy, Verification