Hyperparameter Optimization for AST Differencing

IEEE Transactions on Software Engineering (2023)

Abstract
Computing the differences between two versions of the same program is an essential task for software development and software evolution research. AST differencing is the most advanced way of doing so and an active research area. Yet, AST differencing algorithms rely on configuration parameters that may have a strong impact on their effectiveness. In this paper, we present a novel approach named DAT (Diff Auto Tuning) for hyperparameter optimization of AST differencing. We thoroughly state the problem of hyper-configuration for AST differencing, and we evaluate our data-driven approach DAT by optimizing the edit-scripts generated by the state-of-the-art AST differencing algorithm GumTree in different scenarios. DAT finds a new configuration for GumTree that improves the edit-scripts in 21.8% of the evaluated cases.
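
To make the idea concrete, the sketch below shows one simple instantiation of such hyperparameter optimization: a random search over a few GumTree matcher parameters that minimizes the total edit-script length on a corpus of file pairs. This is a minimal sketch, not the DAT algorithm itself; the parameter names mirror GumTree matcher options (bu_minsize, bu_minsim, st_minprio), while the value grids and the edit_script_length helper are assumed placeholders that would need to be wired to an actual GumTree invocation.

import random

def edit_script_length(file_before, file_after, params):
    # Hypothetical placeholder: in practice this would run GumTree on the two
    # file revisions with the given parameter values and return the length of
    # the resulting edit-script.
    raise NotImplementedError("connect this to a real GumTree invocation")

# Candidate values for three matcher parameters; the grids are illustrative.
SEARCH_SPACE = {
    "bu_minsize": [600, 800, 1000, 1200],
    "bu_minsim": [0.2, 0.3, 0.4, 0.5, 0.6],
    "st_minprio": [1, 2, 3],
}

def random_search(file_pairs, trials=50, seed=0):
    """Sample configurations at random and keep the one that yields the
    smallest total edit-script length over a set of (before, after) pairs."""
    rng = random.Random(seed)
    best_cfg, best_cost = None, float("inf")
    for _ in range(trials):
        cfg = {name: rng.choice(values) for name, values in SEARCH_SPACE.items()}
        cost = sum(edit_script_length(b, a, cfg) for b, a in file_pairs)
        if cost < best_cost:
            best_cfg, best_cost = cfg, cost
    return best_cfg, best_cost

Random search is only one possible strategy; the same evaluation loop could just as well drive a grid search or a more sophisticated optimizer over the same configuration space.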
Keywords
Training, Software algorithms, Computer bugs, Syntactics, Maintenance engineering, Hyperparameter optimization, Software, Software evolution, Tree differencing, Abstract Syntax Trees (AST), edit-script