Contextuality of Misspecification and Data-Dependent Losses
Statistical Science (2016)
Abstract
We elaborate on Watson and Holmes' observation that misspecification is contextual: a model that is wrong can still be adequate in one prediction context, yet grossly inadequate in another. One can incorporate such phenomena by adopting a generalized posterior, in which the likelihood is multiplied by an exponentiated loss. We argue that Watson and Holmes' characterization of such generalized posteriors does not really explain their good practical performance, and we provide an alternative explanation which suggests a further extension of the method.
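The generalized-posterior idea mentioned above — multiplying the likelihood by an exponentiated loss — can be sketched numerically. The following is a minimal illustration under assumed ingredients (a Gaussian model, a Gaussian prior, an absolute-error loss, and a learning-rate parameter `eta`), not the authors' specific construction:

```python
import numpy as np

# Minimal sketch (hypothetical setup): a generalized posterior on a
# parameter grid, where the usual likelihood is multiplied by an
# exponentiated loss term exp(-eta * loss). The data, prior, and loss
# below are illustrative assumptions, not taken from the paper.

rng = np.random.default_rng(0)
x = rng.normal(loc=1.0, scale=1.0, size=50)   # simulated observations
theta = np.linspace(-3.0, 3.0, 601)           # parameter grid

# Standard ingredients: standard-normal prior, Gaussian log-likelihood.
log_prior = -0.5 * theta**2
log_lik = np.array([np.sum(-0.5 * (x - t) ** 2) for t in theta])

# A data-dependent loss (mean absolute prediction error); eta > 0
# controls how strongly the loss reweights the ordinary posterior.
eta = 0.5
loss = np.array([np.mean(np.abs(x - t)) for t in theta])

# Generalized posterior: prior * likelihood * exp(-eta * loss),
# normalized on the grid (work in log space for numerical stability).
log_post = log_prior + log_lik - eta * loss
post = np.exp(log_post - log_post.max())
post /= post.sum()

mode = theta[np.argmax(post)]
```

Setting `eta = 0` recovers the ordinary Bayesian posterior; increasing `eta` shifts posterior mass toward parameter values with low loss in the chosen prediction context, which is one way to make the update sensitive to the context in which the (possibly misspecified) model will be used.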