A-optimality orthogonal forward regression algorithm using branch and bound.

IEEE Transactions on Neural Networks (2008)

Abstract
In this brief, we propose an orthogonal forward regression (OFR) algorithm based on the principles of branch and bound (BB) and A-optimality experimental design. At each forward regression step, each candidate from a pool of candidate regressors, referred to as S, is evaluated in turn with three possible decisions: 1) one of these is selected and included in the model; 2) some of these remain in S for evaluation in the next forward regression step; and 3) the rest are permanently eliminated from S. Based on the BB principle in combination with an A-optimality composite cost function for model structure determination, a simple adaptive diagnostics test is proposed to determine the decision boundary between 2) and 3). As such, the proposed algorithm can significantly reduce the computational cost of the A-optimality OFR algorithm. Numerical examples are used to demonstrate the effectiveness of the proposed algorithm.
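The three-way decision loop described in the abstract can be illustrated with a simplified sketch. This is not the paper's exact algorithm: the composite cost below (residual energy plus a 1/kappa A-optimality penalty on each orthogonalised regressor) and the fixed `prune_margin` threshold, which stands in for the paper's adaptive diagnostics test, are illustrative assumptions.

```python
import numpy as np

def ofr_bb_sketch(X, y, n_terms, prune_margin=0.5):
    """Illustrative BB-pruned forward regression (hypothetical simplification).

    At each step every candidate in the pool S is scored by a composite
    cost; the best one is selected (decision 1), near-optimal candidates
    stay in S for the next step (decision 2), and the rest are eliminated
    permanently (decision 3).
    """
    residual = y.astype(float).copy()
    pool = list(range(X.shape[1]))          # candidate pool S
    basis = []                              # orthogonalised selected regressors
    selected = []
    for _ in range(n_terms):
        if not pool:
            break
        costs = {}
        for j in pool:
            w = X[:, j].astype(float).copy()
            for q in basis:                 # Gram-Schmidt against chosen terms
                w -= (q @ w) / (q @ q) * q
            kappa = w @ w
            if kappa < 1e-12:               # numerically dependent regressor
                costs[j] = np.inf
                continue
            g = (w @ residual) / kappa
            err_reduction = g * g * kappa   # residual energy this term explains
            # composite cost: remaining residual energy + A-optimality penalty
            costs[j] = (residual @ residual - err_reduction) + 1.0 / kappa
        best = min(costs, key=costs.get)
        # decision 1: select the best candidate and deflate the residual
        w = X[:, best].astype(float).copy()
        for q in basis:
            w -= (q @ w) / (q @ q) * q
        residual = residual - (w @ residual) / (w @ w) * w
        basis.append(w)
        selected.append(best)
        # decisions 2/3: keep near-optimal candidates, prune the rest of S
        pool = [j for j in pool
                if j != best and costs[j] <= costs[best] + prune_margin]
    return selected
```

The computational saving comes from the last line: candidates whose cost bound already exceeds the best achievable cost need not be re-orthogonalised and re-scored in later steps.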
Keywords
neural network, regression step, decision boundary, statistical testing, computational cost, branch and bound (BB), A-optimality composite cost function, tree searching, adaptive diagnostics test, regression analysis, forward regression, candidate regressors, composite cost function, structure identification, BB principle, orthogonal forward regression algorithm, proposed algorithm, A-optimality orthogonal forward regression, experimental design, design of experiments, model structure determination, A-optimality OFR algorithm, branch-and-bound principle, A-optimality experimental design, neural nets, indexing terms, linear programming, cost function, neural networks, nonlinear systems, branch and bound, convergence, testing, parallel processing