GUARANTEED DETERMINISTIC BOUNDS ON THE TOTAL VARIATION DISTANCE BETWEEN UNIVARIATE MIXTURES

2018 IEEE 28th International Workshop on Machine Learning for Signal Processing (MLSP)

Abstract
The total variation distance is a core statistical distance between probability measures that satisfies the metric axioms and whose value always falls in [0,1]. Since the total variation distance does not admit closed-form expressions for statistical mixtures, in practice one often has to rely on costly numerical integration or on fast Monte Carlo approximations which, however, do not provide deterministic bounds. In this work, we consider two methods for bounding the total variation of univariate mixture models: the first uses the information monotonicity property of the total variation to design guaranteed nested deterministic lower bounds; the second computes the geometric lower and upper envelopes of the weighted mixture components to derive deterministic bounds based on density ratios. We demonstrate the tightness of our bounds by simulating Gaussian, Gamma, and Rayleigh mixture models.
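The first kind of bound can be illustrated with a small sketch. The Python snippet below (an illustrative sketch, not the authors' implementation; the toy mixture parameters, the uniform grid partition, and the helper names mixture_cdf and tv_lower_bound are assumptions) computes a deterministic lower bound on the total variation between two Gaussian mixtures by coarse-graining the real line into bins: by information monotonicity, the total variation of the induced discrete distributions never exceeds that of the mixtures, and refining a nested sequence of partitions yields non-decreasing lower bounds.

```python
import numpy as np
from scipy.stats import norm

def mixture_cdf(x, weights, means, stds):
    # Exact CDF of a univariate Gaussian mixture evaluated at the points x.
    x = np.atleast_1d(x).astype(float)[:, None]
    return np.sum(np.asarray(weights) * norm.cdf(x, loc=means, scale=stds), axis=1)

def tv_lower_bound(m1, m2, edges):
    # Deterministic TV lower bound obtained by coarse-graining the real line
    # into the partition defined by `edges` (two unbounded tail bins are added).
    # Information monotonicity guarantees TV(m1, m2) >= this value.
    edges = np.concatenate(([-np.inf], np.asarray(edges, dtype=float), [np.inf]))
    p = np.diff(mixture_cdf(edges, *m1))  # exact bin masses under mixture 1
    q = np.diff(mixture_cdf(edges, *m2))  # exact bin masses under mixture 2
    return 0.5 * np.sum(np.abs(p - q))

if __name__ == "__main__":
    # Two toy Gaussian mixtures, each given as (weights, means, stds).
    m1 = ([0.5, 0.5], [-1.0, 2.0], [1.0, 0.5])
    m2 = ([0.3, 0.7], [0.0, 2.5], [1.0, 0.8])
    # Nested partitions (each grid refines the previous one), hence
    # non-decreasing lower bounds on TV(m1, m2).
    for n_bins in (4, 16, 64, 256):
        edges = np.linspace(-6.0, 6.0, n_bins + 1)
        print(n_bins, tv_lower_bound(m1, m2, edges))
```

Each bound is computed from exact mixture CDF evaluations, so it is deterministic; choosing the grids so that each one refines the previous (here 4, 16, 64, 256 bins) makes the sequence of lower bounds non-decreasing.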
Keywords
Total variation, mixture models, information-theoretic bounds