Diffusion Information Theoretic Learning for Distributed Estimation Over Network

IEEE Transactions on Signal Processing (2013)

Cited by 93 | Viewed 14
Abstract
Distributed estimation over networks has received considerable attention due to its broad applicability. In diffusion-type distributed estimation, the parameters of interest can be well estimated from noisy measurements through diffusion cooperation between nodes. At the same time, the consumption of communication resources is low, since each node exchanges information only with its neighbors. In previous studies, most of the cost functions used in diffusion distributed estimation are based on the mean square error (MSE) criterion, which is optimal only when the measurement noise is Gaussian. However, this condition does not always hold in real-world environments. In non-Gaussian cases, information theoretic learning (ITL) provides a more general framework and achieves better performance than MSE-based methods. In this work, we incorporate an information theoretic measure into the cost function of diffusion distributed estimation. Moreover, an adaptive diffusion strategy based on an information theoretic measure is proposed to further improve estimation performance. Simulation results show that the diffusion ITL-based distributed estimation method achieves superior performance compared with the standard diffusion least mean square (LMS) algorithm when the noise is non-Gaussian.
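The diffusion scheme the abstract describes can be sketched as an adapt-then-combine loop: each node first updates its local estimate from its own noisy measurement, then averages the intermediate estimates of its neighbors. The minimal simulation below illustrates the idea with a correntropy-style (Gaussian-kernel-weighted) error term in place of the plain LMS error, which is one common ITL criterion; the ring topology, step size `mu`, kernel width `sigma`, and Gaussian-mixture impulsive noise model are all illustrative assumptions, not the paper's exact algorithm or parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical small network: 5 nodes on a ring (each node cooperates
# with itself and its two ring neighbors), estimating a common vector.
N, M = 5, 4                       # number of nodes, parameter dimension
w_true = rng.standard_normal(M)   # common parameter vector to estimate

neighbors = [[(k - 1) % N, k, (k + 1) % N] for k in range(N)]
A = np.zeros((N, N))              # combination matrix, columns sum to 1
for k, nb in enumerate(neighbors):
    for l in nb:
        A[l, k] = 1.0 / len(nb)   # uniform weights over the neighborhood

mu, sigma = 0.05, 2.0             # step size and correntropy kernel width
W = np.zeros((N, M))              # per-node estimates

for _ in range(2000):
    psi = np.empty_like(W)
    for k in range(N):
        x = rng.standard_normal(M)                # regressor at node k
        # Impulsive (non-Gaussian) noise: mostly small, occasionally huge.
        noise = rng.standard_normal() * (10.0 if rng.random() < 0.05 else 0.1)
        d = x @ w_true + noise                    # noisy measurement
        e = d - x @ W[k]                          # local estimation error
        # Adapt step: the Gaussian kernel factor exp(-e^2 / 2*sigma^2)
        # down-weights outlier errors; with the factor removed this
        # reduces to the standard diffusion LMS adapt step.
        psi[k] = W[k] + mu * np.exp(-e**2 / (2 * sigma**2)) * e * x
    # Combine step: each node averages its neighbors' intermediate estimates.
    W = A.T @ psi

# Network-average steady-state squared deviation from the true parameter.
err = np.mean(np.linalg.norm(W - w_true, axis=1) ** 2)
```

With impulsive noise, the kernel factor keeps the rare large errors from perturbing the estimates, so `err` settles to a small value, whereas a plain LMS update would be dragged around by the outliers.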
Keywords
distributed estimation, diffusion distributed estimation, diffusion LMS, adaptive diffusion strategy, cooperative communication, learning (artificial intelligence), adaptive signal processing, least mean squares methods, MSE, diffusion information theoretic learning, EBM, error entropy criterion, MEE, information theoretic learning, LMS algorithm, diffusion cooperation, least mean square algorithm, adaptive estimation, non-Gaussian noise, information exchange, entropy, cost function, mean square error method, measurement noise