On Variational Bounds of Mutual Information

International Conference on Machine Learning (ICML), 2019.


Abstract:

Estimating and optimizing Mutual Information (MI) is core to many problems in machine learning; however, bounding MI in high dimensions is challenging. To establish tractable and scalable objectives, recent work has turned to variational bounds parameterized by neural networks, but the relationships and tradeoffs between these bounds remain unclear.
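As a concrete illustration of the family of bounds the abstract refers to, the sketch below estimates one well-known variational lower bound on MI, the InfoNCE bound, on toy correlated Gaussian data. The dot-product critic and the toy data are assumptions for illustration, not the paper's experimental setup.

```python
# Minimal sketch (assumed setup): the InfoNCE lower bound on MI.
# A critic f(x, y) scores all pairs in a batch of K joint samples;
# the bound is E[log softmax of the positive pair] + log K.
import numpy as np

rng = np.random.default_rng(0)

def infonce_lower_bound(scores):
    """scores[i, j] = f(x_i, y_j); diagonal entries are the true pairs."""
    K = scores.shape[0]
    # Row-wise log-softmax (stabilized), then average the diagonal.
    shifted = scores - scores.max(axis=1, keepdims=True)
    log_softmax = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    return np.mean(np.diag(log_softmax)) + np.log(K)

# Toy data: y = x + small noise, so x and y are strongly dependent.
K = 128
x = rng.normal(size=(K, 8))
y = x + 0.1 * rng.normal(size=(K, 8))
scores = x @ y.T          # simple dot-product critic f(x_i, y_j)
mi_estimate = infonce_lower_bound(scores)
print(mi_estimate)
```

Note the structural limitation the bound inherits by construction: the estimate can never exceed log K (about 4.85 nats here for K = 128), which is one of the bias/variance tradeoffs this line of work studies.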
