Deep Specification Mining with Attention
COCOON (2020)
Abstract
In this paper, we improve on the deep-learning-based specification mining method proposed in [16]. We find that when the length of a single trace exceeds 25 or the number of traced methods exceeds 15, the \(F_{measure}\) of the original neural network model drops significantly. We therefore propose a new model with an attention mechanism to address the original model's forgetting problem when learning long sequences. First, test cases are used to generate as many program traces as possible, each covering a complete execution path. The trace set is then used to train a language model based on Recurrent Neural Networks (RNNs) and the attention mechanism. From these traces, a Prefix Tree Acceptor (PTA) is built, and features are extracted with the proposed model. These features are then used by clustering algorithms to merge similar states in the PTA, producing multiple finite automata. Finally, a heuristic algorithm evaluates the quality of these automata and selects the one with the highest \(F_{measure}\) as the final specification automaton.
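To illustrate the PTA construction step described above, here is a minimal sketch of building a Prefix Tree Acceptor from method-call traces. This is not the authors' implementation; the class and method names, and the example traces, are illustrative assumptions. Each trace is inserted from the root, sharing prefixes with previously inserted traces, and the state reached at the end of a trace is marked accepting.

```python
# Illustrative sketch (not the paper's code): a Prefix Tree Acceptor (PTA)
# built from execution traces, where each trace is a sequence of method names.

class PTA:
    def __init__(self):
        self.transitions = {}   # (state, symbol) -> next state
        self.accepting = set()  # states where at least one trace ends
        self.num_states = 1     # state 0 is the root

    def add_trace(self, trace):
        """Insert one execution trace, reusing shared prefixes."""
        state = 0
        for symbol in trace:
            key = (state, symbol)
            if key not in self.transitions:
                # New branch: allocate a fresh state for this prefix.
                self.transitions[key] = self.num_states
                self.num_states += 1
            state = self.transitions[key]
        self.accepting.add(state)

# Hypothetical traces over a file-like API, for demonstration only.
traces = [
    ["open", "read", "close"],
    ["open", "write", "close"],
    ["open", "read", "read", "close"],
]
pta = PTA()
for t in traces:
    pta.add_trace(t)
```

In the mining pipeline, each PTA state would then be assigned a feature vector by the attention-based language model, and a clustering algorithm would merge states with similar features to yield candidate finite automata.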
Keywords
attention, deep