LMS-2: Towards an algorithm that is as cheap as LMS and almost as efficient as RLS

CDC (2009)

Abstract
We consider linear prediction problems in a stochastic environment. The least mean square (LMS) algorithm is a well-known, easy-to-implement and computationally cheap solution to this problem. However, being a stochastic gradient descent rule, the LMS algorithm may converge slowly. The recursive least squares (RLS) algorithm overcomes this problem, but its computational cost is quadratic in the problem dimension. In this paper we propose a two-timescale stochastic approximation algorithm that, on its slower timescale, behaves the same way as the RLS algorithm, while remaining as cheap as the LMS algorithm. In addition, the algorithm is easy to implement. The algorithm is shown to produce estimates that converge to the best possible estimate with probability one. The performance of the algorithm is tested on two examples, and it is found that it may indeed offer a performance gain over the LMS algorithm.
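The abstract does not spell out the LMS-2 update rule itself, so the following is only a minimal Python sketch of the two standard baselines the algorithm interpolates between: the O(d)-per-step LMS update and the O(d^2)-per-step RLS update (via the matrix-inversion lemma). Function names, step size, and the toy data stream are illustrative assumptions, not the paper's method.

```python
import numpy as np

def lms_step(w, x, y, mu=0.01):
    """One LMS update: a stochastic gradient step on the squared error.
    Cost is O(d) per step, but convergence can be slow."""
    e = y - w @ x           # prediction error
    return w + mu * e * x   # gradient step

def rls_step(w, P, x, y, lam=1.0):
    """One RLS update using the matrix-inversion lemma.
    Cost is O(d^2) per step because the inverse-covariance
    estimate P must be updated."""
    Px = P @ x
    k = Px / (lam + x @ Px)          # gain vector
    e = y - w @ x                    # a-priori error
    w = w + k * e
    P = (P - np.outer(k, Px)) / lam  # rank-one downdate of P
    return w, P

# Hypothetical usage: estimate w_true from a noisy linear stream.
rng = np.random.default_rng(0)
d = 5
w_true = rng.normal(size=d)
w_lms = np.zeros(d)
w_rls, P = np.zeros(d), 100.0 * np.eye(d)
for _ in range(1000):
    x = rng.normal(size=d)
    y = w_true @ x + 0.1 * rng.normal()
    w_lms = lms_step(w_lms, x, y)
    w_rls, P = rls_step(w_rls, P, x, y)
```

The gap this illustrates is the one the paper targets: RLS typically converges in far fewer samples than LMS, but each of its steps costs O(d^2) instead of O(d).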
Keywords
computational cost,approximation theory,timescale stochastic approximation,stochastic gradient descent rule,linear prediction problem,lms-2,stochastic environment,least mean squares methods,gradient methods,least mean square algorithm,recursive least squares algorithm,stochastic gradient descent,least squares approximation,schedules,convergence,least mean square,lms algorithm,approximation algorithms