Improving Calibration of BatchEnsemble with Data Augmentation

ICML Workshop on Uncertainty and Robustness in Deep Learning (2020)

Abstract
Efficient ensembles such as BatchEnsemble are a simple drop-in approach to improving a model’s accuracy and calibration across in- and out-of-distribution data. While they bridge the performance gap between single-model performance and independent deep ensembles, their improvements on calibration are not as substantial as those on accuracy. We examine how to further improve the calibration of these models. We investigate the role of data augmentation and show that augmentation techniques which improve single models can surprisingly make the ensemble calibration even worse. We propose a new data augmentation that fixes this pathology and improves BatchEnsemble’s calibration. We empirically demonstrate the effectiveness of our approaches on in- and out-of-distribution CIFAR-10 and CIFAR-100.
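For context, BatchEnsemble (Wen et al., 2020) makes ensembling cheap by having all members share one weight matrix W and differ only through per-member rank-1 factors r_i and s_i, so member i effectively uses W ∘ (r_i s_iᵀ). A minimal NumPy sketch of this construction (dimensions, variable names, and random factors here are illustrative, not the paper's actual configuration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Shared weight matrix (in_dim x out_dim) used by all ensemble members.
in_dim, out_dim, n_members = 4, 3, 2
W = rng.standard_normal((in_dim, out_dim))

# Per-member rank-1 factors: r_i scales inputs, s_i scales outputs.
R = rng.standard_normal((n_members, in_dim))
S = rng.standard_normal((n_members, out_dim))

def member_forward(x, i):
    """Forward pass for member i, equivalent to x @ (W * outer(R[i], S[i])).

    The BatchEnsemble trick: elementwise-scale the input and the output
    instead of materializing each member's full weight matrix.
    """
    return ((x * R[i]) @ W) * S[i]

x = rng.standard_normal((5, in_dim))

# Ensemble prediction: average the member outputs.
preds = np.mean([member_forward(x, i) for i in range(n_members)], axis=0)

# Sanity check: identical to averaging explicit per-member weight matrices.
explicit = np.mean(
    [x @ (W * np.outer(R[i], S[i])) for i in range(n_members)], axis=0
)
assert np.allclose(preds, explicit)
```

Because only the r_i and s_i vectors are per-member, the memory and compute overhead over a single model is small, which is what makes this family of ensembles "efficient".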